[Binary content not reproducible as text: POSIX (ustar) tar archive rooted at `var/home/core/zuul-output/`, containing the directory `logs/` and the file `logs/kubelet.log.gz` — a gzip-compressed copy of `kubelet.log`. The gzip payload is binary and cannot be recovered from this dump.]
u/th sNWre;+-Q}*#`B^ "JAHWFJ֫' U]ZաDW4{>]`KZut(mճ.*RJMu>~ 4.BpK~L/ s:9|rM{Lu293:Ć=zīXs^E72>"JmW_!z%=+l|o :| QRѕ޲ j7W ~G/ ;U@WMϕ]q_|u>TҕDW7s+}_ Jt($zt%0b•QW#zRXm{DW=}U/th-;tBN]BˆoKAB+l8}29 6?.|oV$me1|^mq69/Y }1[b̗>YHpiUV8; 'UqݹG,.7.,ge,üT%\r(bֱJI%+9(VyTݮ,홯ϴ~]jm}}|1]u_-]R {/Ӌ|'LcvMw /19_K~Du XPɇtZhh~GKJߞv{+A&9X7:,S"xKƛjD׺K3`_nۭk0+Wp0(]{vZK8ueyAX_j:F}-›6~b<yuC_NFgW _9N5//ƥ1ҿ֜2LGx<ן j?>;餙P7麣Vjx~gI2sa"3t-2.0mU4(x s(V|pj=gqX|1Q```mN+b+9tJnKay~~ au-(ٮ{wg]I[ q_Wp'֏Ճo~ҭO92ver]ו~tF͂ԛ3ZQ9r)9Iĝ~K]m'tˍ"-<<D}ypv:< wm -?q3%/λl[=.umPiZt0&'bvxൌ*Jl2'jgב\!QKmR5 ]'崬& c&5}>- sh h0idtdAaySwmai<< 9k$aF]oĞ^}ПVM#8!2㦤i5+YFjBJ+ؐraGAW]xe /?:lp;|S-2VUnКA[B념6'B7WP /VWdLXW:zZ)>YtBgWIíI9+[jy,D:>[+JKatfd"ת$EIyYn6AhnnowݪISxLE?S5ѧI1u|dlRZeՓi:쯓QY~K[謅e'CWM[%3ַ0΃O媝4;;Abi ; /_s|cݥIoTAd{i؅o瓶,A[Klf-kx Ӻw=crWV|\@՗2L>J]{z^!}ƒʟ >t2fcoL}"`=Kqĝf3.8:gG'9)gqtu`L9;*#_[ ig :jBb TAaD zz:nkv2n|yI v.9yhy=i?,~[w ٨y4=:KJAweᢰmN Ԋw+2˞O< `Z~1:e@Ie F隀la9]bh~4u;d9ƕS[9F*ƃ2dsk K1(h\"[3ɓj{CPjx*6*Q,spU4](pg,Fʩ{)Bq{/jW#O [@Xgnpn?e4gR ES)ED䈬>U]ujdqwNPmk5÷(m׆FY*9:cɸSF%\ԒtLVkf|[TΚ1c{=VYfl4}6c>Rɣ¾\pvEinG]?KA/"$EM0F.e4e ; +0|>~sܿSz{VZfģG4NE`' V_hp΋+8FeUXgyM_I ǝ) ݕAOToUW]i-Yfö` ʪzo_ z+w=l%+H,[.wO Kw,X$7aVVSf^N\z}P|Xj7-?I\+qn҂:MJ-N]ܕKϑ"#wEH퍻*rwU=4]) =LUO_.ӞV r;Z̕l< z筛:Nc& 4 $Dɗ3QBRr2~g*8]}+o߷V[*o_}+o߷V~[}+o8V~[}+o}ړ *oex;4Ngʺ|ahf^x<c= _*ce,RxV8*Km0VR[Yj+KKD@x#:LЩ9D!"'nDaIIMf#$4Z$c;L-jp>AC.+Ahų2FdTB3ufsơu4/Ez ?}dZd1'dmMK} 29$[ X1֣julT'!jqԴ3 *d%E@cu6DW:pH83!RG7J;ӉRz|?XE[eD VDUBZXc)LqOBZڞW EHUUk.#B4ܬ[?u"O5>ձ-{pr(!kH)=g: HcȪytgK0քS4֡;}IOӲ7zD4%⒗R(r[PkA6ׂ!qÞ^}a{i9u:bRpV:3#6NڿpBBeToTo`~ծGn5n <7_m> @[m ӦdpL _*V,t,ƅa3Ni8B:2+@FD)TiZ92#$㹂wATH'U?޻T]Y qH|rMM "iydS*(ڤ’trfSN˃MjKq nPj1~)׳ƍȄMzAQ9V7ـc΋e<5&j\דF]IFR*u)/lc6PA=ƘHB\߾o]*|4kCTe43%v@jHCt17g^'t{Вp_j%$Q PU0OF9$l㕷MJc2潨Bo5RPKPYT!:Z$t\Aрe\+iAcEݔG|q/2~;sUrc?^l=b;JEⱭӶ+ی2@.'CZJƻ\LP2鈄똢K04 1ӏ`A$w@ƻ%a|,gAM6~{2 EV'c'y1 .4*N;3EX2j P2y_re"-s3V "9#t`ƹlǝ5fksb2qzE揢Yy: #*'R`dZf9.QDÃx ,Yg8OHL5N Fv{F|k<]c1Q X:VӘ+'i\;N34ctz㮊!F i*RluW){䮊?ĕAWEZO])QVwu^]M(닻* #G7k{LCg2wWEJɫz1 \z`$v>\y]&uWIkvR1vb wpWXS3ͪGEoU/Hkթ+ 
c3tWi- Peu~Lr]RWdg[}ҊgcEI;o;X,L2ȠlADJEX L!R;p {>o$&#WBq' A: Y9X1j.-zlL̲R58r-hS%G1@WlEZcN}SeW*8 9$^!sH:!sH^LW@H9$ Qv/j#n.:M{uڝ~0#xw#;C T:M42}x;1gev:muE ::Ƣc,:Ƣ53 c,*̫c,:Ƣc,:dJXc,:Ƣc,:ƢXX1uřD\gQF{ӨJc3&ި*K5gب*@pĦEfcG9 ]t a0(/,蔪9>lHO&gDY"gRŵ^g8C(a1IQ &`H%=3^iȴRK ;㌗R% ؍hYC2g/.[H2pcOi pnnë_QNf&WoYmAhY~6-!d:!t)KH00Pq&흭Qc9) "&xb]04-UHkmXx6K@軔 XP"#CHZbnW{l*]7ѨcFAX-?!ZcK )kڔ ؘLv&)E_ǡjsmMRG R̭'4!c撈,ZQaI&D|5n )}Ty()ǹdlNB 4!9KR;Z0LB-p)ef\Wx|z1 ~,p<-?`F%eL=R}VT??V|M*nbӥ/P:8^Od$%Ӳ7k}bljvs.B]~cӭIAM\%Yr\w6?Pn^~Sj>Di8@ni\jJ*/m^,t{ˋwC8 /ci~rH|s/ZM,[_?-0b+;g\2qq.PKepiG_/3S>^m 5q.0F 1"LHy`)nJ: -bi2xr ~yEY*[_[ڵ~}h?l4-U mZT:[S Z6Sֺw^ ,p9 -\і"]~ϛ'ӛ0*&ho!ݣX&wn'x"O] O W{-qBPb\}RLI cv OnMk="? &*xHAr$/\E*T 똃VѢr[өSO#Γ/YeΝZ@4?_hkC,wrc1 PLd\)NJ.jWh/s3gc :Epҕ,4DL'" 'J)Nah,0bb/Ӈo'VKtb5+Y:70mfY/,+x ^ =؈E2H)M?7QRBN7Y$рd B7VSü fj,::#Xzad0~=n^_dIcə8~Y-Yٖ%`2fwlad$R.+ 2ƨzrEYpogo9≵XR& oU F$(pbH1wKE嵲$$+r0(BP\F$-w1cYtv֙8[@.nQTT #IQ-uJ6ʄb ;yu ֊V{|o]痢 Gޱ[()T ք%mDWRZې(Ql*)ԶZA8qR'!kqB2&5@H͋8Z^NeUl ΄J8q{UzܙN!**#bt} t*dX"R-$aQfgs)Chںˈli\ƴKLEj|c[lQKVXYEr9* Rب'’Iѝy,XǓ}4a;ӺF?|V\;Ҵ4C^B;O $jUAB_ :-: PsV;unuO s$X`'*⮪RѻW讬RrWF3u]d]YŊ+ ;m@I^_/GdӻwoƏom+xM߶!s̓ +~yi22bv:۵Z9Q /t@STt8Ǵ`w] 79`mϺ},bvNV;O||6/6e}vEO}{VkۆtןnO#t;ŀ0qºM7V L&g cׂW¸:D" UhL"/"9zuK hSE 72BL Rx4SԪx>˄;2aE^>mys_=ݒuysoWgM;Uw4H +Q L: :>O)  {xɘ` wVЕz3x.k(Zrӕ;JW4N'=#󱃨&69݀ >F;(P*u=9׼]!حxGM>'Q u@&*e䜋E:&#_+] P2s^nq!Q:٠l0C-2cnkO!^hk9HvmL@-LyIFv4jd3|A@\4V0tx;{gQ3u8JCIdO:dp&K( DAM0 Tz* W/*vl  t$ eRMPn ׸PJx󨷛kyLW#֗G5x͵K Ɑ&{#GZNԻM[`Jͭu xO/N\ޠfEO da=Wf{zbߠ^Sm͹E3ݝ"(!/Ji v57rIu*|>aQ:L_.x~:hb`)'DtHBeؙ|OŮvbeyKɣq*gZ@ ){mfGFCZi2To-\:-1TNւv)44E*,Q8K ;n(ɧd  6m7KbҴT|P5+P 6$),5dgAg// "wRfLzjDg:qC%oy}n㭛Pwt(J]moug!橞xO9pu?ay%Vѯyd0ϩ4:$lؚa:r*-=?Kg}N]b.: "4ɰtFj@hsʛL ^F\(MƧnj7S`%wYy]xsx`7l}iGY&cƎc>qH!m;ja0t'x\CRF$be&Rup,`_&(˫0TGrdbD4u&LBxc!gWEWu_ާ: dD1/%XH2^aUG_n>Q'چT(u-JM..G §"=xU%yo}2MQaKۘJ1O4hxRIiL(]ǩ`Hl5w4/Ez ?cVQR(d K,!E=-$PTR ) m:Zq6>NZ~' (Qhb1Z ԼHhSWQV( LةDZWE/!xQcTҩa\H!aQfgs)Chںˈl繩ܝ4Tg:6-CxU,@VQ"\pސ(6*Oh粰mytgK0r_3 yu؎þ$Ŵn߆ֽk'CfqK:kD*H}vTxW8-/;?vTbM4Ao@i߸$Lc"qkwBoEqUD 
EC.E6Dm2,IEy6٦ !od!!ɒH$v%ΖKo(;2:>y{ò\xν%l:u\Ast?4!FFzeW 9+җᮢ/ފ+4qx1ֶ+n+m mk]"S :  G; -"{6l|{!}U(VE4;ʭD늱nUѡHKk<9ʖVL$ȼBrϛJ9x+l`yoҪWr{miC* PdAT$(h= `.$L**2ReCoHCr~1d|٤9'dexLe Z O71[fX#?B=H06vA;`jlQi_kl5MY8UdMJCm~b}uX{Ƶ5av3auQٗ<lyr白/eo >~U4zUefiU-ZhӢ\5X`YO;uk܀ljlA Dg~Q{7y0`XG[ sۃ4=խVhrgD?!-8z5WՊd?-_-VlAIBy:3˻b{JsY%XcاkkTS, hClC>DpDRS3ϤUW.T!3wQP_r쳟t9kƸІy9uFw~l54!;߰2ZXFRnlG5v`>׾V a%BE(dAl)Pi&)"B'# ݱ8wR$$3p:eꛡ-oß{ywҌ^^s'Pv@!0Q&}0&X\%̡L*&&^̄ һ#}s[Cjw?^?O\vd'Sqpu}ғcP#T^_/Gdӻwo(?{qBaGCqsqcqCXkRCI柿R3$%ΐ9~TUw Lr+-GX}Q}W>CS 7 y+NI/4Z_flE&5ً+Ո+^} hqf_4[<^0#0;o=&B)"(y *ԇ{ _(`h7#(w}I?A>\&6ȶ3a9yu:09Tj)a_+Lf]\J`BNG+s{>bгQנUb!%dYz˯uu? 6b[ψ5tTACf̌FѰk}X~Yч\~G'{|x) T@$< T*Pi*Pi*P T@*P T@%;)UJURT*UJ-UJURT*UJUR7u%S=JsMesLnM'*;ƛTaFwg^gI^.倒zB,\Vqq e\eڥH1:Rma2n,xkhhBs' FǍٓ?H|0gUE*7vy5wkJDs #1 4'D;gBv!zs,t)XLaLP:D p lSa i">mB8 $5+dXKB|6~]/Er EU 0ʳbx5#szUbf7I+]h>~CQU^ c3Š7(efC鐙]߬˨Dhɥ^~^i{Q-˩Ml%o&) N-^+[\CSlGrI> nDZ X\`r_Bv7j5s|)n@ōqw_vi2zB7LΑx㣈0vV:"7J J1u(µ=3)aſ`v`8T^<\*pu>1.0.3c,> Y01d~ƔrFIoWˍ"3Qvxk.Mrj:5@: _] .;LݫjENj?5iˠT0 _nAg@G0̙,Ji]t+~ H}#4Yg#72xqqmeLR1 ?{%/TQi =~9Z?Zbp?}܎K5?<mEC(*Y1Y2smW-&2s頂Gm$u0ཨa5|0XmQsuG+2<܏[;qM',ۏT]c;p &,|ש[T? 
$UsTUڂjjtNmT RR(םC[o.%õR{QXN/ILxRrA(:o;cƌLr佖9&FSS HmVa+ӧ%Jl YKrћr];4yRf56#-M%fVٚz98C66[syL q^)Ln]Fj6&ڹMmS4[YT-n{.2ȹS[^́ٯ~m3B># ٵz>ttju%jY.֓$ (8FB=1Ϭd0v?L&7o~3Δ9g[fQ$RFޝJQEO+=Zq;΅bkV I.2FA9BNXsK]`N{< Z ofq(23zјih 5d(V8dNcڦ`hdCex]QV%nq]⪋c<0 brpb#Vpۈ!jEN` IUp;oB!gh՞k:1g9iZ+Is`瞰U@Q{4wNZN1k<:[g+_KdӜl4|zǂzo-rUcr e+κ7GP1LQc2eB +iF%תĿFw756l~Zʩ6H97[ȝ77\_|'E߈*H@Rm'O- n$>g-L2]FMWJ*X>:˺k 2Ն2pȜE&5dd8k٥f֬6 }~<dɈFP \ma?/7V}>^o-Vى5VcNFr }m ʶfNNsC;"PG /oD@&{佹'wɐXa#%iq(iOgڣHNhOK0N߀Y^/>Zݎ/y*OZ+tѽ!~ŞEvkTNu$)o$4Eϲw xh߈@{s8 g=ᗠ}s'ݜ #{bQy4sw-}Q3BHrun?/{v~e䰁Cf̌FѰ Xۑ˾b|BzgһMd<1[ \zţ-$jK:~HwN  g;zSV^Fb7r`Ut%oN \6(tѡ^}6VAnzOFJ1vcu58"KI ˿7ȁ~\C_8+{}/j5?@!|g0${ X} ,?Ӱp!7ښ.T~ԏ4;E92bzJ2%%8ٻ9JD]{X3a_]ZVZoE%Y-<{/ J.?VTリcE F-S v:.:AeYV& *VoVv|8xNCN`_.k)*$``j[oO}c+|& I2F͔[yDd!ڍLGz ,Y/_#`Dp/KQїTۤռ[>YUhIv{#"* a?!FKqJs|J9$sUmO0bQ,+ɩ2"rL ` 0~$RHڐGΎ%[f\m]ﮙLɵ7V reSCؗp Hǘ̩@=$(J%¬01ɕ8HXZ,@˷7{VoQSJ%T`+xغ)K?|nZ݁# g3.Cm;#m w}›. m!̓1ھ WIv. [eR1EAp6w1TiUAZ~;u8S8 C -J#686HF/g)Oc.Wknl*M{=u_F髫TSK 6[U\aE 9gL!2 ,%& }[Ǫ{xjM>KAW!duZsǵ#ɹ9qRaLaשKSg2c KI S"!&Gn,,xGIn6tl8'N'͑cq|ߵxwYbбݸDy~顱/_:@ `ϻ7X0cN86"97z/0"}@`:H`F❣vgb>2E_R5m@[b4dɾ7to(Jҽto(Jҽto(}@%u4FqNIg׎3f8vXGTݱl [Ul(,&ۃ">Ӡ<,|^,V}fZgR_ݸi&>egLD!e!x0k5f,`#1=65gzɑ_It^ ٝ3-dW7)ɒlY, L1d`0>#T.0љ$'`c/;Ie1Q"i 2O},\k$N)}|R56\bQOpf)\nU{kZ; o%1oTv&kRBrhaQuH2Bd 6X[J؄R؉r(ҵt(}kV~.C=l[yzTI.k@ pgy]Ry NBPZ$$&D]KaY=! KB*9H2Q0Hdi/QעBxl {b)2VddhJTjIRQ5D%Uu=hDejEo=)\=н/vњ'Ό`0N@hrlHrVf(y%雚ƏA2;,&H]y}igBnxǩlhڈ'Ok$|d*.iDt ^t|:U]%tJLW0 ^+zS`,zS`,eY ~7b‰y0/Zc.{^z^` LP #|+n^ ûs? 
}1loG\,d'NK9mG+$( ep,rRYșѸ%o[ۉRx!+?E\]ˆɣ^g[ٰ{_h6kdy|}ɝ`Ӊޭ%tG{/Cծ7mHL+|v"?Bn7t?sWL1*-|6ndmjMu 56#z"z}k֠odʹ5XjgIŸ='L=سkD.BvѓRCe]f.8sl]B4}ߢ?bG+G+U=l(rܞzxmirGWbOnHN ͇*:PHS\wo4._5>+_ q9N؏\YRm5N@=*N q}ʽz陖lm0@BeEbP,.U)2UkM'6DB-ME Θ GU̺ebT댜jt+nm=-3'(}xG^Ms:IV9@֑Qz.EȖm &B ry?q [v?M;ee_v468Qkh͛=O6Oqm6n:Iΐ l.e?Ott>t8њFgϬO}r4unݼ٧Ao|2KJnozwuS9<ǂxN_dS{rlRs5ݟ>t&?OU:VPV(OIKԙgEL:z Z̳SV &QgEr;zvz\yHH4 K xb@]PVnOZ,W3iEFTC*0tT.Q@4Qi)]ޱuFΆr' f .bTYDX n=ODø`%ߍEV[DK=,)Y")RN uDN!RΥ GP0dH!8qbjufjqIO $Ap:B<^$xΆR%Aɤn21ɢ'  RnWU38/3xQDDpϕΰC2YH2] zO#ճ`xM[/REں m7ÈnlX':iն ăXŅ4 $$SeXL:JgGBZG"Nɟ>{[P)a]b|nFT̒ڞZ"S7%x4SIjMBBOx:(Ѣ DEWAsa(dIƫPx8.FY5pj~\#JT.MPITJLruB9a]nߗjDE\@H84QR[EsPyfeGE}ŒR.I6ppAS\l&E&$-p.ri o@g+b+'z<,&mth?WF9vCrm5T.9vʬKfx-(]l>WC@c?4GIʅ[VO=|PAfPfvm>Ǻƍ!f]%/M1w jM)cMKsDg-^h:LM?Dї8_"1aջ4y:?4>y~6 '}7Ki{3'qyFr(s[8#akrv$3;Ϯww6N|{ܛF3Z@AM8 A #@{-ɕ(L3r_L" e쏛cKl<2\f]nhy8?eƽQf pӤ/ZZ.S ,Ti`'uz P{>-=v5CDe[H uGfά,f7G6ڟ4MY ߡy2+' ڠ•Nr *ƭkm TwQ0bA1<ٷ?U߯0~JB}TNGtCa3̷Ԓp4FkBG C?/u$[s{㠛y=7՗f*]sR5Ab?Fgޕkw35EW䴞Eyi^z4' 4IH @O ,E$H.s:~ν?ϫ@݃Rl * 0Aam%_;|6/9n*Շ+{Ǖʏf[ %2.DL"5` clB@2T绪J2S }(Ίxf%Hkeoް۲^ʌ|vTS *Pȡ9"LG Bb>:SkT+|r@EՄ 8"8_dy6*h\^j|)$/REeZg붆9'9%n6l۾ B.l`a{j};7A &Zk׎3[KJ¿8Ծ6* 'qHY J`,FK5҂)/f Ԁub hf Jq&rf&e6,Yֳ7;#g&"cµ{iѡ튪:qKEM-K&#wѕ>%:O(H`ZG r&Db:Io'; ES0M8#kb/+v!V}[u )&rxɨFѳ<\W7Ud̓Ƨ#䣙WpǗj.hzq5IzBH/'yfurS;L`s*⑝^C:mnt ;6JP %&2` ܙ!Lڹ Rp\cOvN`v5ja]_۟g]Af_/-Ɠa7B7&=Mj!;H]onAaLǃ,;#׬+~KϾkȿ* ?`7fneznC<ëq8݊n%4 !Hsy &qвv@C.//za7ܒkko!m6\[` 42xj m 6˽ -x^u7A zd5pj^kfE4 (iKǬa#%Ѻku  _zGYB0>`L8Kj5ooL)-GT ZoS慌cjݑ{UYoa=*IU:n-]ՙ>a9ʀ9=0"AlAG͵1Lkm#I aH{%"|F?ePi艹R$LjnmΰWUhA #(H8Hn4/l0[ڵ}?~n-v6 ;Tg)I>/ی.Di#VF m 4wF-mQ9b0XJBԀDL"qpAOQѸ& `NX8ga2f[x);SƐl)>qrnrܩ_Vs ՏUH!WfAJJaPK|ڛ\x xCΡ j|Rnp>^KHZ\0H(ipblzO?'Hs;3?W1&Es;(Oh>c HJ*5f<Z=9)x(܎gpQz֤K җ"/wRr2KxhxǬXPd>0x}//~4@a@h_2O8G\M5'~\GzP7ܫ1I sU<܆0LIf؅}'5a|rl%g1<%:>nݽI"-Wۊ{x~xV6ÇU dTHt;Z, =.ь`Q$ #$%=_K]|-*V2s.h !~> +r;+96SSi"wqndPyP)`%{QuRԖp&*tq<S(3Dƙ\^H3} /=p;pX`ñ{'QX%TϏإ )jEeviM^8{2;oét8[`B<UX-zg՘IDk1hFs+%;gdх&O2# -, Qoӿ|0aǨRIZH.xh6}6 gchupvVإ$k{(%vRd3 /e_S=P2zEx#t+8>jK ^Ȍ ^`A 
ƜZ؉}-!kWKZ(!=_oR}:St7tͱ͓_Phe4,gy 㳊F#Lx.P!`V0/2ZDJGt`ZD"rRu9yV+Ұȅ4*T0u4HP&$ZPBXGmcB,YDz}E:FHH5J: ĤH[XA`Ir,a#ֱ.tlK2kZִi}jEkXJ8V]ZzU*t8)Cs߷o,T`Bi 4C,lo0v@6_0"~zٖ=p,S1=Ld~lZ\[,$-{#"||o0EV#FKRJswP)#>68Wѹ;p^\lZS7 %.]Åӧ#SvAK 7bTk?ݳˍzm#ybky)@Ȝ ԃjq.ŠYac;8HXVN WjVWliʣʑM^0,ѨN[QwHc0[ Pum;;]'r#d_/6d?F}M4{t:I״IgL=ŲOy-N[1>t/D_~n^ږ/N1ᾌrnKrf0֣=HwO #冣Л~U)})7zsrv (7 1 4#h+k"FC_4 (9^QLNTO1\4+GێEO}/B/jKfDỌ:M93GR^4Ӗ7nøZJP [bA̮#K-~]93`ϴ/Yn=gllM^r&)&_=5U=H}P=#G{^ :1tC_7ӝԭ&^`B/eD/twz*et , Y S.1=I}N7{9OLPutrBVTaϓSQДY4;n4ARLSS(x-I 8%I<4(N(Q JFRy&X-@QI4L LDae`) Z=Sx gwuI4()-ԋk:W.}%_glz53ɪWZ$?Ora%1U| bʽ8cE0:C$y?x)rf`bXM2kܧtSZ{odڄ Z;Ine)2Q]45eyոbz.O7\:? й ,I^{|0 cj;0`4/T7̶?AfHMNh*u/'GxnOfYl4\aeIqNR  uRFCD)'s's]T9b/5\#살;AaL<_P$NiXpj'3+5ܑL\~Ⴣ_wxWvPsRur%qFuZ.wx#YDCO! 8o3ԧ)|ND*&/~~I||S!ʐ\w̵}&? O?v($j ?RhMfUFɞD "RYz^  g=Vonl7o  %(Tm2Xͭ !\k=ɋiy;µ#U?]Q*$;4W~0gn:ߙ=LeEnԄչXZTaej8xWG9:(t60ͬʪ_niv͞6G7wE>vܩr;k5LR0cq̱6&t߹U|ݴ,x+oy^%尠J /R\ZqaѨz`ܞq9SqPSg|>ƕUnf]9v:Sq/LG kaj3jX$"﵌FMFS`PHKV\ݦsp/f8[=\r`8M- u3},q7JrTW zlW)MҨh&ؘ=Kܞ~VLMis w. FWܛvNT? \ZjHEBE}nuDos6U;dwM3O7 F;a:7;;d%==$!Mzzko#k!ן6kw{ m~w=G脹 䵽q-_ϲ~\*߿ztdz?]jx[Tz5npn7tD ׸Q*xVIǟ}a>z;(ӏ>CW7ɧmF,B0#BXYLd>y/5}Vf<w 3+sͳEOXY);}:Qa"mWf'}zI6n)~kt9Ny;Un-Wn/s+qm 111u_~И=8k M=Mp?~nTl͜[wHMrq X2#)R%]*y"#|0z  )c9q ća"p&c;+ ̒EHjddz9[u0U %+Hϒ $'gIױk"iINTϯEC>k-2Z`vRgnlK} 2!] 
X1; A'uqJ;cm5A{ǠBVR: P,K/C-R`dLȨtqTMZWi=XF[( 蝈2fHpR1y< =+[K 괭+'Ǣ|sO|n[tD#v )B2$q"e3+"{t8aHkudz 01z њƈ5(i'.@&ؽQ}a4kM2t_щ6Q`\fWNWcJjBI0K=(Ho=ZKr:mB\"wh1n{ 6ާ^qtޗi؛Ҹ GFIgok+H]u>4}]oh(=-X[sßx_ ZfH[%G{3}g!C!-PN.%LBb_Pk-TZ}u5![}ӷ@̬,ӶR`9˝F{]u͋94>(, !0Ω\ADfYpV`vvr%GIX$z ,c\:chM(ol%M˴&jy˕xwP՘ms8no id4HȦ?{EU PesWFځ+,7vΫ/X'-& F9T!2Z[xEt:gD-8ƵY#s춾/q`k}>`;k5yV2՝Ϗea2wP@P+ zwƬa\=`lFq5WO[֙Qq+s:sGgZ"p'c O\j=vsUЙ7hDy+xj,BrlN ?y_0*a~Uj9[P ,yPN0"\*"=r@_6RetɱST&=h;g@7T_/ 3\NzGb;}Z-~{N(ŮI}wu<,uAf'*|^ ~UguyTWxc(RPSxXYO76d\gN hIpzy<) elF1+ H\K:S{S֚c9 2 kC$2Z@jۅR ֢\Rc9+}kq 27wZhS!7Ƣ9 a\&a=#&%He2cJ\%bHo]R%!MÅ8c.59uzT7W5r:O~ISd#1 Q:cő[TFɎ )I1 ;G.$F7)M8"1]KL dc @Tpe!iЏ: ➪@+foWYj!+Dg].Yv!3e<{MNR<`m&=MIN" :i)״"e);O%gI$"N(JEeO wԮ"=k>*<]6JprqIh&`eEEj$8|p!WԖ\Bl8sWƝ1&5pe~ͣpRwIjҿ,IiKyE:jͦf;@}%^UJx=.5On Lp{zczu=;@ZIÔpFx6Y(}ݚtZDN"{߫-֛2a+R.f7A-#n%󱬾\ionJ9YrscT4Ɛ~}7ӏqp^Z,_}"l3[~&8+Ϯ٥Rrp2IW~l2%ruj>3!@6c s+˄&XҽÇ}(J&1}/#$ukr=fbL"X[j9ޚ$V%uק.g Tp9>j{$-f I>Qq|z|}']> 0Nk9ۗ{ˮ}#4+uO'rdQ'j%0FH Hk+5 /%Ȕ6 s}oF9C2=e WR ƨ !SPS[YN=6B+i#EՓoyr#Gc-m`HƠ)p+GLWa_ռ"\p* T`~7A&IV0IYkr03%Ѐ$gϓ9Ptǹ xBb4~LJI6HWl\] TRBk9 YlHYTzE ,*1qTw 'f{yDeL)lsNDțRrZ3blLjN6GD:hyc`"XnGC_B|D@bkzaє˒:iXP1_}# ` BKP'˅AD76;ڧ#abGy@Tct"g6Ys9`' AD:HW-gڦ?6reOjkV_'zι [>JOY;4(& [J WGT9{m#&E1>>y ;xWKоL;"n|E"^bA_{ߎc@.Of"rcۻ9-.P\G6DZ 8]ѭ8%bW3$,a4t^&u_<)l}Q.q&u)fE0]p"ښǭa뵻n -ɽ\~}Өaϻ;njns=4HD\T8+Ai嫮hqqvv}f\h : r7DR/FDlSѿ)݊^L2T0d1֢޲Vr~f>qzaCt_a7 qE_>FT,P(15j`j!hcoI@PY"u9qAX] NV>k㽍v%e$\nqu9f1kjEtq\h R,_^9]2f]r6|LO,d*)Ձ(U͓QrфmT\u0(O}gBJ]d:Ukđ@*PŦ.IIN%Y紱H%\Š1VоBUޱQ!&%=*[Ȓ~%m ѤI\qeR)>jBIbijvN‹I},&ug*1o+<휷{gw66[M{>0i`l!T&C˟CBucG\#I@@@ONAQwEpBSdІ ܼkͿ>qo]&#<.u#OMR릍֫cuF0ֺynU?AT>~Q^ M;yUȼ;|9YRd}J% KԅJ(?c^W2U5"*k_ }rK'IlOoJ ]$Lq|?oC~p7롷]4^_ápnlwRб@۔RhSU5koww'Et Y\~s:Cm Jt)2@Fc O.'Dx9ީNZ|]}vtX*znEOwoQ/7@68.9הTCj#*885&P+쵷ARÈ=ѵQ{u Qlap9O򄎦Ӌ/b\IV6H( #{H2 [T!V@KUa5` ׁQ˾ U8 :Qʲ-(`E7krR.+n]ؐI2bl X Z) 8:VfSRN5@K[!0i9"mDڈDI# @~-1SD=hUMeu7x@"Ap^?[鿍G__tPO>L;Au`v!j5+S"((n۾xDm~?r|O=|O9Xvb 5*8<:LbgVX `(b,-Ɨv^ՈՕ:hO'+{wp~M=kd9ܵsrvWRrR''uN:Hf]-"d-ךZW!y*jɃ*sp^ח?ضg;j[~d@e\ޑ}@r, F>ͭTu]{4dt 
zet6Mm,*P:AG2 ]ԖĢ kGjqYTNXT˫\̀B&%E!$Xbe ɑsлP|58TaB@ΞZP +/!E:) y?n)k1YP8Vy-;s}@υ!\䇽W-ռUEzܢ]N>frR\_h2{љDP8&Ej2lS1V7[ vjZ=goZ R)r>#[WD*JQA#@PSF2˃T%A.:@K)Ae*hBAzwuT`>(ځq6pYo/f^'s`i)s`jDcq*IN0{5XaCBKB:}`}zTN5עwC1\>ePs9`bBG5tF;i .^ $I)Ď6BKTF_ N"hDqf8daGYȔ6D!sӚYcgg"P[vT݈<"2">w5 Ds&ex(B|D@bkzaєP@lD1?);6[p!hY㨄X+/ $t;!!4~g=m`gYhQM')7w_͍s>ɼLk7??5'_n&sN-{w,/{rql3Octe=ӗ2Jm_&4˓kLarYȣZl`0/==[?EZ߫=vG;?ܜ_qrΆ4/:U iޗ\|{ZZޓ+["Nj ]/>\\=V7h Uq)UV}sޜV[~Cy_S>)ح>.fg!V5X)aUNv—)o0ʳ{e-7o " [{o\R.g7ݖwtkVҴ*> "VC/j|ϖj/yS'nIT{Մ]|.Wk݇VOh5a[f~;jܕ' ɺ'JC$ ![PXbFiW'kL&<ғ7^AB l ZժB$EP :y:AܩGVa x!gqC4Awc`qnw7/e] o@BzιE` ":w_xC$yv ؛/ЧķZR&)iw xȦHJjWfO񫫫JF[i2C)$]1UAmN[ZL軼0Cn/@9f<3Ԥc-33D[0y!xbP6M'm b+tCGjd^ )E%KR]&fo=)9et2E.|*L=YM]}kd,^ =vUm±ӎdQr X@zDRRzD%]7X\PԁK\kZ",)L6&$)u`L~/fA({;k@5)to~n}B!:SHӆEK)ER)4Ŏc8 P) b *cg>%VhfE R/[}3ٲu-wЌȍCN0;]^?]uKh]?.'7wmi=G٬:ݮwkn7Mg|a8lots(lm| ;ұ7`vtv-zCI5ß6mݲĬKل['{KMr\Gr%r9nsh_+UkvtK]"/b~L:{#Nfrƾ }+']jl:NlNXCkuvtrzĽ揸,vN7ݰPCl̬:zmrfZŸl:6}\2l:20[eݴgҜNDN:kW ͂g<9P^5NL 7F85;,pM1 :R` NR.^WSc֫DcWfr#5%|9SnY:qCXҪ=‚΃\4Pr6p+oCJx‹Ǜ/. G):<(oߏ&\|nO}C_a%ڐm{IOؾ]vx(_sdݶH iҞ N:"Ӵr6"R!J --% w#y|ŒZT1ĕsYjq,HrAN+ATfRI9ʲ8ʀ'Tل;$@演QcìǗZȇ/<&fz dž@tXĜ4!xAw3ok]֣. 7y7 '^b̛mgǾH˜f"F0l蓏IRqXt97 8]uH[]F‰uXTg*ӍuM~^$q yTv-$|I#J}F>`ٔHrq.>D"]7XCR:BS]hs06GF%ZD)EԒ)E 0;uK0.3BK`zgFv,@&Οq7kV{?~Q]M*Lb^Lq^jmRDNꪫuNZdtR[dQ+,A2hE%3bq":]$Ȍ'YE5<;Wk6vUBIF&z?Vb%Ks?3^I_G}× 8tTsǴr 9CXTD)w b R)9+ES9oԪq› fm<`JwX9|Ua! , K} MCb0Tu>toC MN+}r{s^C*sԦ<#L*T RUvbZyk8s!q߳FOY*9B7q9xs0힟7\M_\!E(a+l[M!/1Vj8=fJW%JV9CS٧0՘+MjT ƔȆf[ ]g}r:/WaoPNnnY"P(~4C,S?\7b씃wA@? iY Ut=NqXyoVqo0EV#FK ZT=%T 숏#+?D\ySO&(NWIm2hkt򵮌zAMTSOKf LT 6 bj\e;ոZT&q˽dS: >R7pkw7N Fy ÒM[KnӢ$JђdKk,)Qxe| e:G(-uWuf\.g[zJ`4k*>}~.0Y97٤P2zEʁPG )8>jK ^Ȍ ^`A Ɯs9kZ(m|ߚjᓇUHZ{ݎ>\%zOv"t<`V420)D%$@tY$0R:"*X2y;>IWg(렔0 WG0 ('@{T@'RccZad(Bb%#1z ",iP΁:KFbҽ[!C!v;(ҿCh>V,jUuN.Q9'0`w G ]H7 ք)| ?GF' /+~? {_p~ߋJ6w 0eQ,ԡMmr`oj] >s$3;T,Bht<{}ۉD5fZoó5a SouU-a|?& ^P|ui%`{)Rོ(`7Æ*Q1ԪǿkW~6w[%Bղ#vGŦ*5LԄgWt|,p*i0s75`\ea$s%m)Fzww[6?w+Ğ{0}.Y>;VꗂA<ݲnvˮ?ݽ-rE*J3Ҕ)Ȣ4l, W(z. 
P ,-RuhZz.UT<eW$gRKQ3dFդmǮvYz0U"XaW\E΅] SgWJ;vaBavF*ٰD@®R:JT*ѱȮfO]AU=>_zs 0xs_*TfpuN"6"k0tXt "6he=")r # (y6*VRF.J {2F Cq-§:M4PJw`)p oҚ;g gV#km$IB,D䥁Xۋ ?xț$4EiH{zwG/$EQEVȬ't&RA;)_ވ@*R qPUŤx0M(;jϕB|SH:$ *ݕE*`ØMP0jdI,.hQX{w0YJ'k)٦O e9%=O(@3F?wMo]ag,',|Y^0irX/yx~sῈL'^8ժ,={J!KK VSE`[~TRS#Sdb_lis'yX>DA0 l{#C-cVyu` ʔWًʺc$(@6TuJ0.JCv ^a4Ba(X p? }oۊ}$ i2^H,T<+T.[6+BΘ,yݱu6=ի?ށzsKPХ}T@=5hxRd!{Qa({bT0F$YO t_ߋ=Wr PR2 )Һڄu&˺@bRI)(ΉVgP+NIK=mdB2%EV[$5)bWF(x1E_8Y8Nsg2=/!xmUҩxd2v 8Qf=gTBs@bmEDxlw|mKUu&"*v|8W=ChKQX YEiŲKV)LTgy|x2Iѝz,6Wю$P0kAлI5۩͸I6i gO:rt+W $*"!'A%ӈl oF/jU~*aͣfoȾtqv]:T v=j{tJݓW=$AyAUϐHB{zsɶ H.R?IQ3 Hz:g2R UTH"Q2DcҺhx--;ERneE9႓hCY"RH]kb8f* AȰ>!P{C|=AA{,^~U>+8G[}n=_.;ȝC/ 6Tk_䌇co/n>mZs4yeͶCnpf.wkzaynyogGoFe-bK8Qe#{kOcw)_4wxs oKƇ:[x+v>_þk|CUx7S切}+Ux<^HW9U*N>- , ,IUi2v5܍ڕh W|۠Jq o..huk(64U>7x-O/`RLꏅ ` ඙nPpQZ*d hAYyFzH"P%apDEQڧK׆}Rs2At}L3 [8."~[7h{cӗ٪~\M]%~z zIꍴP)ET=8Ojz2H$H$BNQB ).^A9oqX܏[Zݡ# v<}ٝe|e{ܠbzYE_YzxaIau准iHk0;Ք>6'ԮaUx|lّb:=oX,E_Ni[ +ai|-6KrMw%¦gtr111#an>V{غT#?BnW_*fyHdE +%weg&;՘d\㐾ɲ-TZ1u]}3|u(EM)oGb@/O;LQ BTg!dXK@h5ZtًΣ,|Uvd3KOΉ(H˫I_,sѪ=#݄z_\9.FԮ!j Oe ܮol*:q6Q4@H_E4%CM&:  ) UͮwtUTGzn-Ē)]c"8aNe'HuD x! j%nb?J;#ogSn^Vϳ  ~E{w&8j#ev64G>~DKiɝdj'm_bI_X>+9MnۤK9,Zȓv=.FY9]T>wQ\ҷҐBHΑZi Bk !ǂA$,ĠR)lBKXbʫt)]KͮKCfۄnS??6w ;^LgUt[}'fo#9Y \W) =&1a0*]B-߫V IX;-s "Ɣ G@a x"^E! 6АtkKH=X*coHƌ(P{ ( XʂZDgn .*iUҪP, Bx٪pP;cPy%ڽ^c G 6hbZr&c6R)=6,St {a9"uy껺Rq,*SiURޡBlE1LQWH.%XUIRЪޡ2Z "j@U&\L%ގ{N==!ZpFG0w8r5>ZAa\QPWVuԩSG2 U&װcQWH-*SXUWP]1} #x|5/??͙떇l81ꡪ<x{;xGă:A5D\YaLY)LŇ{3Z \˱r,K-R˱r,K-R˱r,K-rLYt!JbGoCer%9[`>ߖԢޣLUGQzTUGQzTU׸[@#yVkZ=VգjZ=VdX*Xj9ZcXj9ZcXj9Z=qUT3X͗fJ )gjZpXZV%gjZp HxͨY9•}@ 4|iKyh?$!Q}F(ӻDdqʍ&U@a3g ٽC(P d% FZc!B: .F<#ChE{y> a7o[یq3dzqtδMF-{Meq;-.(0+M^ ! WcfDQLIPJGsֳr |uϜn_gf8#jO%F_Gg,zzsRȱCkB5ڣ,PIaAlVݢH}#$Ӎ6$hS4Z_K ?HWй8#/ H˻(), (&^j= rVZ).%urxe*9#Gv( E\'V- 3G2ԡ}*! 
%J}HqcLoЦ>GsjTxq96~Syf)ˇ;Fq^G/Wh3 ʘt2OZe.(Dud\PdUr![BՏ\ !HYX\bKe@I"2OK :JgGBe G9X~%)!OUX:{t^Oav9%6i&0O}/.vH& 61Lo&B`~s 0iῢOBxgNk'8C]|In݇_ /gOnG"/N/_ɵ=\ 85e]${@VrEK")O8\,e9k-rGpE|8 mm2rJ^6qN MNQ„,Z'6qV ?eRqQR[ \d#!0ό,hhOUA*+HO٢+\)8I)BCeKG[J"0FOޢ`RI!\ 6>_N@|iir]h8ajͬn%[۫ٸsUO?ύ.v? o@kG+75W Jxԏ; u.p6D2zωϟ}7JH|Xոu痤|hѬqmeb:ip=l<ZXLqQ{QV@+ٗd.dCOD|Zf8V%O%9 䊜rbx醷'ڔ˓847.;(ɣD$eDP[| y>oPѬ> ~>4yֶ̖'4ϕmrۢV h5y*_G~դFrH5-tTdp{ P[{-C|#^!Lq&hӛ/,€av76K`M?dwd}+4u'8*pB4).֖JByǹ1Juq|\2.i`8)eZI+_I.B#@AKHTj;;Wx>ũz~gMY:oߏo[޿[12)=obZ?K;ed>>2Bf^\ow҂0U^;|zgzF(*NBwx2 Fp'_&jM)K2r[TI.)MZ%S(HFS-QhheDHJg@}bR6F]r Ń@$F"}&*H,h5T"=&bv5F΁)#g}5 Ҿ'>S8uΈx:zOw7O\yzϒ5huzN$+ύ`T PŜJ3ɸS)1(1Y Q 52J1YmMrh%T#:EKbLx- j[17F~ ވހﴭZk$J8lSL3Ųք8F;_`̜,31~~luǃ|s(+ʾ|w**|b<_Cb_ʄi 6K-C9- R -RBhގE\zvEh{vu<9z&yԪ3Q{bE]t)hhH]! tk?E]!R)(+dOE*lDkKIkUKWWޏR'*/=o_~=c΃w'^ѹ6ߓ{&+.͛`>g>,^~#,)oMc&&1S:f* -0֑NJ4ncz+%?Հe& ?RUQ0*Kf)*c읆N:uSAEqv.sx-ZgiVPTʁ[5tHT6"q6o~-M^ݱでXg%. " q,<${/a>_{x׎8 /Su?~>otyXcgP9#BIwf8mtɩ|"nT6F)RGt]9C"e`T<j[#TgGlTBCAUˉB o2wO/M&\Q1̂TsgьFA$e cR I}4wTDFtTP$HBS49EBk4NN)}qcx3;\iE[sSPޫJ=!&lkOFͪdoq4v㷶9g笽y rݨVޟulyQjBHΑ\ UOHP!ǂAV R6jZ36F~ J? &y>n&izjBw{񷱟%n~* z hF+4OfsFsT^3A%^6n#xc6ZxK8 pz2 ['43< )D%L{;+ !~`3RGemWd1+bIGD5D%UwhBƶ8ыI+vJIk@(qBChjתj v32 J8əISZ!ȵZu4ؠ'C 6(M*+m( ۃJNr.90\)BZ4EOm 7zI+!X%4 hK9q&YL^2j#9JRBb$b*eOQ֩~? =\>67UxStŇ:X%=ɼ-d)5>?|ܲ,+#{tC)h`IqNQRtԹh2Zx굡ȀJ~V_uq^۶@nq )Y'JyUR ot- B\d @`E\y nU@x2zEXWf85|0Di 6&'qS`BeN( qo⬰Y tP[ qQR[E F<3ȣI~ŒR.I>6ppASq6j"L2'JXR 0tƧjL]jpvԁg!wE:Ў usD>u~g7N?18R5mY?ևaG߆ukZaFFH^?mV 9= WD02ϧ\|ijHb1Aʓ|<źzĵ5"׋}nا}?rdMtu(-m|SV@ם`4s< >YJɜnq.WTm}\QB9aWCru$+;;v5rw{]h5yq DCxfr% ӂ ,_wT~1Y[3[^}Kq2\_dIo뀻f4S j\>U P^M5^N{V?rQr{z9O# u] 5K&p<Α-o; ?!|J+<%X mBΣ|ZZ|j ]?]M=dǛy\y9j'?ͯ>|!<P g`4R4uoNlez15ӟzq\{^n})={Rw܊m~&<uEQ`ɥ-l3_zL%v(_Kwm͉Hn f~:/" 7-M$3RɆ5YVqSqK>b96kj J1*JJa)1xFzZ7YWl>soq<[-hqxw{D~?v<-+*.ˆWd@ff倹eYWvxcPMa1ѬMԱ^%NV"Z6.d!m[BeLw7*$\UҝƷ|Ql /6U313XwgB>ַm.z'].n6<`eh{D3hmc5fe2g8? 
]vS*)<줲SW;sK]%7UʻbQp>r~\;w&qty* #,h9$Eᒋ9^1cDF0.;p L >+6N!bNQ6^V,Ж='.t2H ufQ Uf͑#.F0pԧhV9>ٯsd8q,0@w3:`4B I9eHr %N~REi&VNyD䤎  Fn:R1᯷SԖaeh,"A ÙiIʯ5fZz]ySJp{V7槞 )+^5bxo )ڍ|ϕmeZbZMv6Rnv^E_d+&mԈGːc"I0UhD-w[W*VoViya7s,;1 ,X hIfLZn%SYENeLC}*FwVp6+Umj1קԌ~%<&\@RVYWV&a-Ɖ 3akB`:(VUov}mN]hap0ovj\Ⱦay61Ҳu06|%jȵEk{lΠDuSλ*?_XWP.HKQr% PRj, [C"s&)'D fV,K.K~ށtGNz؎~,~~Աc>ڸu;m _ޅVfueWW]Օ]]Օ]]Օ]]Օ]]Օ,VM[By2QO!:sE?ۧEҽOÕ=$!\m2 >'޿b2{px*v.^ sE32,EY62~ƞea}KkO { m]mnTm;I^M_NRMȆV==חKͧkE_;4.,_`qB~/ *aj >'*qyqs0] 5 'ӿN(Qwު_՝Vw[y;o2URZVzWn=[㺕ii*ʹڬD"g{Q*@LP"AF8%8ZS+܊JZ8!HeYҪJҪuVyԳ8}te 0[Y }Xw8}$g+όhApLu+M%e OMbb""bhXb#VG$F^1SD)J@”N^l8gO'yJw:5'_[eMؘ{jY^ȲGqyao !?13$wx_cˤAqp}4t׏so-:kWE4x48Ĺ km٨mR$vㄹsp M8a)3k싷l罳 \n# Ƙz(k Zb3jT1)?8)KBJǤ7` hGt˚Ǒce(6ґ@Q$h'>rQlךDpJ^<.6#jQӬWg*Z''U$_$ x>vTZТmgÚ7;k{eQ=e?d\?ϹK#6X))5eXР9sıb\jE>l_U !9yG z! q4tJܪ$"M JKb9,-2NTYxVYQ~CpvcYlB~|9{ &GH”i80f6P̜fƤ;Ƀ&&Q@372WMljʁĉJ ,S9@8LE` {I#^Kͅ}=-gˊw1 RqQpƹ)52JX\19;J ulTUUiW3bFkw_`FDb;q1V" pTD P8 Ӓ7;ht8A q2`BXi8 iC׀iTpcwPBF2rz#}9pXJ:¥N5&:l` ĜZZLj2j#3ʔ$DAې$b"q,^礳O#ܣ0 MwwWA#֨aVI!"EXd^GZ")jNȹ%&8 *h^X8˶d5qҶ&x䪿S^CX)!tjt] Sh@0R4 V ȣ4 E`7[9!;1⮀k((a7Q;䬣'9nykܸ&׿ܕѫTZEKj Vh ;bwS)'ZQk A(czfl 8 TR$NdIr_=qh})KTߊ 5-zt<Gh+~~xmww|]J mj`.dxvӼ LR%!񜯃(D@_ NB([MST.6+GU5\cAcE\qʈڹB­L9:a`XDr&Va ц~'2+uކ[w߾dE%]X֋G:3'$U%R 0QLTε6DtAqh6TzY3UzSz-0ya&!$Sa8E4 3&$E\x8/Je2/?>3 n쁮[-||E< rf5нDh׹ׯz}r n?Ftjs8ol6.z|YO_z0v?3Jb϶TW|Sg\>>yK\j_TcU2LqA\?.QtV3G"XkjPKRM^ h:@_6 gPӡg: n0?}M+ۓWN`|p QMviaS] }w6?/zYe77 Q<1C*&pM 1A)F@Q),8<hXOK&+ mgn==-?΀#hAeoV{jV{P;YԋTji듔rWIwZ\v,BȲ`FgoPer[} Xfuذ3?rO-v̶ᙇF-C:&#b&WAp|\2[,620ǑydݲTiMYA`'}ʾڙ[4*٨:P󁷐㤀og`6 wq_Sl[|YE%E.E5dFڙrp{==%6`YUQDRk%XbAF Auًl7LϏx0ߥ*xcWoo6kH&z=?SZOgKU S.# BU\HNQ>@C鳻tve{=iy]Fg'JR =]X & H+cָBG#u2SX%Z5;81J%u"ZlFnfաU[tcrXƒHCئlғ/ZNi qˎzj~w5(˛TAm,,LR &GR"UtI!ؐIE4HlH-/\9.3복`1TCqTH|$*pSL0mD V ɌZ*r mYq kژ^7fl)g\}yb¨5 diPȇD^$xƋ:%6b lu֊R9ЇAZKE~fVQR,d,1Y(Ql*)E唏j5sh>pҀ»ۋxɲH2 A2b(\UYI$2a2(6co!vM)DNf! 
HI$ҞͳQs|][2쩚AEDijOlWWLc[BlYv>۟4m#HE,h]R`g.$7)o: Hm>1?J1+"4#\pu\Η²NIDm`wv ȃ1?e̖Q?Wzs^Gu&G?rDG?0eV8YRtÃI \M*:cq}ߗF?gP0O]3l1%"0~FaF9S6ڛ,M*M0ʣA>x{~ :>cF\4CYbOf'?~2ջF>\F411.˪OZKMll6}nyn/@BWѰ̞Z5~[oU(dπF.qH̳g:{}NB+cVgjMNlP:E.Qb{:5J[ڣEňp10ͿM>no(}o`MZ~Q]6&W 웷 "368FAȀJ"ib5蜍rP%TlDBng/$ex:rI5ɢvu0w>I~6 TY3x%g k/]ɿ^AV̟i/V̗5\Q1y_@\]B~ ߿[,Ks>v OO?ل#׫'?Q#Sj0*A_dy9l5LYe3V7WYK||o1sf-s5!IB^]|X|gnsk:bϱ Kmw^Mh噟ǧZt2[(}`O?.qbgk0sؚԓgӻ?`꽴A[W' uRX((,)hQh5Jv)2$ &BYt}%Sq6y%2 :1u٢5353>0wKyZN^/Gonې1SҬFUvq9gTJ0*μ0W`T0h`. [k.E")I'g_Ltvm"ce-޽ǻTPYKߨB# PF0PcJH+RVZЎ!)sl}-5VVP98K’p! fi ԝL8ߥȐ-󒈯_ٻW}7}씺Zdqͥ@Sg(DHIKɀ't*mI(,ݹpTKS쥴2S\%f WrJ:Piݛ9wK~+lZ>bKlK<`n=мW 򙐥qgHG_֮2 tɸcH+*HG9Ɍ<XCV4x!|X/so;|%o0KL.F ڐ\ lH" L;_]R)!-nH~R'jg]cs bCT(3Z^xs{s jʄ9a"9QXr(0<(ʐEQ5 kc>$1-P؍(sZίoF!ZhR/ھn}nƜyrG}^RjjPELUn6R{&rVO Ƶ3/gH(l/C =??8U2C̷:CΧN%'\6AiIXtFE)!"Hf<HĂ⍃ d ]Я$yze:Ӑ{"oϽ`^,|| .fKSob^a1[Ӆz¸rƬ;|Fr ԁ7s:E$JJz8вECowiGSVy+l3"Ƿ˘(ߘ7޲{@HosoQX^K&ϗe9)Ft"HR"uI2DQoβٴ;֯m'] /5Lݑ̓'P/M~HD!5J_*EJwJf#T \ЀNS̺mdUm.,- EMq7rP$)Cd-9Ѥ(nъE4Lxx5jWô9yVfɽ:)^uI~2 {Z.ܢw0 \Oaي7PKY,jޘXΓ:R)-J:W%XcSUmÒRQԒ<ۛ?.gh&-N` +4 I Ҫj4m,D OX bio "] Lv-ؠ^O]Yo#9+B=u6eL4vRFGХ-YR[rzo0uX-YS\:,L2Lj.VLU%"3LQ!0%dQ 6j/B*G] ^bb;˜1mLkT 9d`&" Ԩhc$C\p%gp PVVOi+:E/-QM'wޑifqVDåRY%kUnOt4XրXp7bGgLyHc!*?s,K]J>O6ХujmJ(+sɂC,Ph TBͣHYYai'vS>ϳӷV׆B3ؔ6r~8_+jPf6Q&UFhQ&*%XF[2@<>[40tr&Y^X3xֱ]cǒ{:ЏR?+&:]ۜ.?K8uN"P8veiU`7݀]S5*g䕪^[SGtzY:Haׂ$%$2-R@B_hYAԩt<>e \$zq u)ꄂu-c :6:Ύ^r72>^T, 7% G=ewFnkR:.wa}tJQlhr1IgLdX%wjvVE$ g*:}о[CeSF#=KXror5Bzj xQ xl8V h0`]1Y%Se|NI)Yp K,k34&)D'7XTNe]R.ȘCXr$S\D'}"\[ ='SHJ@M<\8eZhIfYQ0lO v ؒ_Biw6e7/ދ>>1)apR8SoL 4m[Ťf<͒ޯsK?F(u7n (Sa}ś6rPpڮO|"E?-3ܶh_wqvUdIZͥi4ݩ)Kb8gЧR3.g+2iS_1GSp^No4+l R{X(VUVGdF'@ ֑AK`!}FZb\B RDThD2s5O^_EϿ?3s&Q& r.V<_+hQN©GvfޑL3gUڤwNPŃLo4ʆV(u14ȍ6@N[&# V~1~sbXrU>@CDr )`(~PbцNU)\ uw`nuhQ F3ou0O'+.uCͬu'tluopqr.w hyVyv*wҶށG:N瞃bk:Yb[_ma6埵DAD[ %(ss|mSE\F#?O0g2byW dߩSI%m򭏃f0(z{_ds5zÐBoRZ!\wWlo^&k8 !ohC``v|_7.7nMJ?ӿM:d(q dcJ`0Á9VWB BI&pQt+@Bs9SVۄeýТ[7D2 ~kG&ǃ|g]WV+bk wV* "BUؠR6.wdGڶ=`z~9rǴ˭N<^zBf4w,5F'PWyˋ~cs<Ӆ{XgyHsڂrqہ3ڤssVd&&sf`Q#nœe[SiEYAtaT-tho*)Q 
|WVԤ8N:\w0*2 ,Dz`gT! HexɡcSTx8b&oru&!vbݣC&ׯy dz]snJ֎TP:̗Oo@ 8.}Ò1 Z"c̱_ʑc]lb&[U2ĸ +<F+-.vjf2L1i4EIN@cN#miAA@:EmWy,{1%X6d+'4 L/HRf*C*7XE)>)Ց~HQ"0[.02EY\"n\c!w U#Z#)Rڡ.J$`_Drō0> ,s1$ v[ܳRh8 2r %9a"p¢1IM#Z;1:Ύ8;JeϷjIbt>E'Ahų2%TB3ɻuZD1 .!{F>}`ZdS)#ZhvRgnlNȥ!Eu)a FjjufjIOZ+rxIy`,rUJJg{ 2 2JupbQYJ!2z½^ 4NDiYn5iix_~ҋ_eIr*2+2+2+c2+r+2+2+2+2+IY_Y_Y_VY_Y_Y_Y ٤eMKKS:mG 3p761ը WP: (ꌩc!'@ZS'd% ^Fk34&)D'7XT.@ IT)sDYZ cd̡H,9e).P>.P-)$Y&I.XN{G2 -4d$h4\jOVdDاR;FLx!+oZFJFRǤqI)V?&8ͤ1,A۰IOֻULnn"?:Wm\a٣ƍ%Ӳ}O=?Lx^ƛ^gEZ-M3ʸ~Nbݧ=㡩IAFa&ǣ6U&ɕ;2YSJe?H0w=,aJ_Hx{~x[kiT6G_O{s'kzdp"%6)w![D{튳rz2f'FKe< ^dJ:l/聀#q>2FV2"{,凇aVͤn}.a{s}4'Y [Y">cm}\|*gg[Ef:ڡ\U> ywjʒX:Y+z\݋hjlسiдX/cB/ &|!]_̀A8D)}@`6RQ]{y"hz `!`weH}l^ CcvvO[SrLN+%Yݨ*[I1G8 IZŸ^q-Js1 W,(sO>Ϗz/ݎ8CͬPR`HJD@K,D0ޜZĜکb{DcTzƠ_c|*]}_ q7M\\Y#*cEm+(h݅(9\#SN7~ۃMf V|GyvZBc^wOs;{`XnH¼eơzl\tR@1)jW'E겧 dOm]T' >ipT!JnqR1rMĿ،'E rZ}*|pUv]z^Rrԭ`Ͷphأ #I)z2;u' 8 @*Vʊ$8#FEr8}b$Mb$M-XB} zSce K!)0GT(2T9V"1M u@ :g(@,J@mrppL,a1r ds@>C U ;~]\j]=0q2_??Ezؖ诓i|eu}fp{nNû~|}q-ɣk`d3ud2kuլq:G=.ts6e-f}Λr~?>otos9C(2]fo}uOƇYhܝY7bk·?ozx{smz#Vißֳ{;X9a[TF:d}Usu ÇVy9,E&Ք~@Be)2t,EҰ,,K5#NqaaM =!&7/D 7v󿙝n]>/j;+ ?^k6{aRBY4އ f3%hP;m3] |)s{dma: 0Xn{o5kaޡvyAg,䓞/×59s g3sC9n/X=S|0SLayS˖>*qtQ:RKO<p" %d\b ^uIm{Iw\fE͂\C@P=NLS.?3 ۷޺p(G:OŶM3#D[&蠐Bapxo1m&9Cz`+wPiD@'=:}537?Cvu (EMzᤀm es1mDGgpʣ)\Yx E&6HEo>M\|BmK9Ċ퍣rG8ClQ= ¦˪N򝠷͠ BHΗ, x'f\ +wϝe8kdelHU>l >|fUvD2K޺:&|PnY!,li[a PxDžh`jM./y) 9:>fϓn^m[}m2jAXۑm{J˩LU7p8>|[6[=9Vڵ4ioi]ik.Άv?]@J+W3[IJŸ}H>:yN#W k B{O#0Ղ@X⍖kΥKPVP2qإjMxD~r d!7m<,?|j¾e=_7:O(H`Ziu!IQX"AAw$X3omgݩmFS_vk 68Gؗ@5f 7˳7ak6unG,MH7yW"MtU9M'r™*1TԽo7 l6JP %&2` ܙ!"OCx --fl gUv1 [ϫ]1_7Ӷ8{w!'pr$xazh9 ChY(ךY +z9F1wGn/_=kLXx^YH@QC A*E)Gi75(gjJ 2U@ QoS慌`WWhu|#Dqh%* 9LI OL Vz4&O?C3?:Z/CeM CnLj}qI0ϽWhA9fq`SbcD=  1kd4Jc,ښPK(Ft*Ř[d*00#g?oYsԒ djvbV]5Ki s52\KL]4kfT3m֗@X6V0'CPUDiR^iC@XT`Aw|6"@mDGwFR$b#:ve)q{>zيEgPQ&Z.QFP3$,$(&So:٪ysn+TVI"Ed^GDCRԒN`Ik'Dc5Q =aN*'I&NI3zgD(3QmD!HR绻жOlfTv G⾀ּK\cICאHsg$PGbPk&r/1L+ܡG`3$ Aznl $ TR$2AJ#:j>1aJ*SK6Pq.ZsM&W6(-}5eJ.}hr[/ !NRϗ%$CPx^T3ReYʠC/\WMtуƩgڹCQ 
V&⹶w83hAzU@՚$(4o}tV,zv-! &gRϱ<Ƿ*f:]I%5TuAHV1Y_J; $uWza|tY^c#yƴ{EX<#~>Ovyccyѱ>w}^?J9%yQPp]9C"e`T<j[#T%U+O߀}erEMr0 R ydshDRfP:B؎fu^H@aHG%2E2>9,h)D<Fԉ"zEݸ#wU;7^.Rltٱh:~ml% ,Z]>ܓ: F]-N\TW̷vKڷϯ9TT*↑D^#y))5D9%ű\iEQ^RBr伧8e(zG 8l :%*I% TiXq7J,,3%d^( Mղ%a7Yu'֠p0~rS8a2&(;B TKg8I#b$h!60ceɥ *GI E|L+]ڃS=Šr]lwtEj^jʊN\E h Qx"^"Cج܅QY[F|&z[1)!R*BQ&IT2#9dO^G>Fl kni 7lIyD_N2TsW526$<P'jz~eOX#$bEH!UpJLd*s*V٘~̿%|H%^beعMsp'C.?S:GU dj[tSM2t7j2}Z(D5RW-p$,3 xd`vv5gk6hob=]i)wovmE,thnw܊[#nApܮY;f|zbe۪̱O9-3SR` Q HNrF."ImrGZ׹ҚM>Chhvx78{cng?5,2zVQ?䂕,Y 14`  8cg!GNPK* #@0ZhޯS98Eq mgA  cmMC }֠ >P1u=PD 7xIXfqT9* cZ4ȒԖHϱZZ͍ލGRJB* B)QXI;DR"HH2ynC<`/#!Ȟ˘w3b$487yRpu*T#<$%R%3mH_д>`€m,0 򲹣QNbAR$[uu- G1*37>;:cM;!K*dȿ)aB )1 [jcb9_ߨGJJ=[$ &1{d~Oh/N GRLTLG :"Y//Ȓkc2'1V79iTyNuJaհ9ȺQ?Fi|83o6&f?*a$Qͺ}!Sк9jJ-xAI"?&Z+5T*R66=hr Kd1K,1TvXtdàTGoтzbV ӝ6"eP4BZe{]rJc],- Oy@EE;AAAk!N4 / h`aőq.5$(QG>x(c Ib1:c#6bk(C-E $` Fc(J{d,S؁qBނU] !1S;ž7Oq 6L`&ĎJC/c.֚c@y\ %@ z*@AN[*CY v-U╂DC@(4] \!IGvn4Zx5{RQR1/ fyj4ABb|]q( ʚ)%TYkBo edzkEHBa!>3nPyre(USW̆(N;FB.؈prBFeOȾMϰ`Y=\WD$ӓ߫^p "QW}2ICیI| x'‘PӖ dUIJ6!*6Y$ڸ̐}uLC˓]@M͗X~C{[!HiRd5k* C.zpˇus4m%2A1|$kdmBTm;^XtuqR% 9աQk2=Ww^ug"ܶCkuzupiƚ a|; JiP+ x{ #A#xAԑJAy1i&! 
5]%b.i`!GQ} $$LCS{pcUۚ#:Z]l%X;6@B`)C:m`I%+`VQGσt&DxQ:WnacQm \g$1iU/P3A2A\.L@Go衇Xt&M*R*z(Rh Aw ?+N flNܬj-ѪO 4o1V #iHEtԀά o=d8N֓fZOЉCXm(u qL:tk o8A$\t*66@:JUeril" jEEY 5wW|IAt8uŲS \hI5|WW,)ޕk)DdC J[ R ͭ^|#ӱ݋[ ϊi493 MXo{WlqzYB1k_wwn:m'[mnA..r۲kimLtr'˫{nM8{7mR-oՖ^[يO?wAWVh brvfk]|L6["AobJ+J+J+J+J+J+J+J+J+J+J+J+BKu~9̵W\A`e\}SJ+J+J+J+J+J+J+J+J+J+J+J+6+`n|9W,FW_"pUfW\ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p%ՓWHCѼ$*)/Ǫ +X`&K=+g\ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p%W\ p%WD0ͻ`ѫO^> 7p޵\ G澽n~v>!;gfz6}>aV1wςsÜaz.+tz15饤+瞮J$]}hhKIW0תR[ksOWl3t;aS+O߮_8ontWGg?R^)!:sُ/RA6wFWWLhl =(3,_D FrtzLj|=h JTQ4K/f?T2!ꉯ<yv??/gAX'%N}u|[|y?^Q[0o,gO~9 }~j c^.Sd5|I l9J͍>6^`VԿgi$ד-x)3],ndNsj޽~A82\ҍ"݌`yfY[L=r}[|w(}97ooˡm ~8<Ģ\<*GG?{"F=fv:|~[v$Va| )zyw?@Ձ> cO]ʛo}n.ZӹSSq$U}|ҶD.1xU"xSͻ=V;pl1{gay^uÕNΡmwzjAPV+g_\$>ؽҟ.n̸[[}߻vwt8|>:i6t^g)6?ߋU J 7oO>/t~|2/Ry|(nns"7mw{[!G|ɑhF# |t:S7V#;YΠju`c8(Ë|ȼFRRʵHIl o8̜{*;.~QoK 2Tf|Xm[gceτL A5DMDb& )Cߋo=g[mt]/5m:Ѻ6VmXhMhbeR^H$}6x׫?/omȜW: _i=SS3?;W[w+w\]iuЕRUSMpQENTa$i * $iDz_\}bX] )o\SoW N}k[ /׋g'}s&ݧKVD6V*4Fp`r*3ֵnlFy&^JKM".;2D/KM" >?qM#<EC.RpT#)8u*Tc1);RW!ShJc؊Y]}x.1<6:E%J'e~z#GJgZ]DwnCUj6utwzf&M O ςt1o6=&v|min@ 2QM3R1\4KM/&]#{P (勸YF02i`lKH"r4 ڴt֧TSs*8>%V^^t|`Z>:"]CnFs=ޒKn0 ʻj5NxoUHxUZp;׆e ,E ie&{mu 1k~*fẤgM0A(p&T  fX޼g窟{cny.mWs5+Ee钷ꯏd̖Nygl®ӷͻÞ8X(M09O @~sDUx`WNs]Z4g*5|U~SGRmOmQ.jٻD$WPkfF[v/-93Hg^fZGy9l[ŀ)8Ζ݆""24INHS*/G\{fJSh=?Mt } >bޝ?Y5]ԟs׾Ft73EIiI J9.D0J۪WCtu(1uWf<wt5rwz7`M:Ypˀ7+'k6{n% -K?>0tʖuRr}E}~?oUIokf4#awS Ԋ!Vy" #n2G_u]$f2(`F1+`$8HIsgwzc Ыkr(Zgɞuk^K/']c#2;۪dp3jҮ%6cgi0 oM=S6&xޱh1&n!&AP7f6C)k beΩZ,F(~w z,ffyI/r87b ǻ|YjtP\K0w`a&yvtJz{#ox EXR0Tk|^̂{ 9FI7lnO[[g*~cat|H*%}p GAB06^YL AHRVa 2LwZ D佖豉hjDfgύon(TdToh7l=-/6îk9ϟQWcE£7WSܑLpz ?`8l<o+V5;$ }4W51ோ 5puWOWWӫ?1![f= \YL[v}1Żry0 7w{)A?W񜏜~tބia{9\jΧ?tΣkN me[^mץ!ty/Gn%6?Sl.4{l>x/Y5Z[\!KHVZ\1H(z 3Z"EꪹΎ;m_GJ3kW2mq7I5ߟj3{b,3 g43N?%_+ηntU}DoέH{F,B0#”,'TWˆ0+Cʘtc !\T Gӝ"֞'-8(1W nNmïtzgԼT^7^zokYenݴnd =!ဴfh=5YDpCJ,s$#9&-@ >sݜIOa 8m6-bwY͚}IOC8O\l`0a0"b vu}f*FqbJamՅA]r,;i)%ԫܰkw 8^_Lmlzvf 
\UiuRjwWWBGpU6voઊ\:\U)W^&<++4WXڽaW0p)^\ec-WNUV-RڼiUK=fo_OjD$@!i ҕBlt:ӌ]l&gmɚg:3]й?k&x5eiM VR֮]'VoYߢb'[)ΚV+kkK; Zm3~o3;\k%pf/PLݔs7XdzT#@Qº\Ky[ K:M ^"[Ți& .{Aʖ̵+D:8Ғ̃D0עxs( &ÂJdtP3%+'UZL׃oS0N,Yrp !QJmJ(CH"&)j}8Ò?q5\ p0Ryzqc^2ET=-QsvZ.xm?c"vf ˡGQX)'6+YVǃGk#DkԱqU08x$oS[5/`=Z|1!%2%kqOab $Z$ɀ_4E )U) :$K),\jQZbx#Ɛ+Lgfj^|&VL0ZĴO2P @#/&DBcv9\bRuIm2*VNv kiG@>%`[S]rgp 1kaZp >*vZna ǤU ,(yJPa H y<qLUt:.a[+Tƅ%HrW4:%ai.bMeAPϘDɱ36)E=v-jޮu9U}n3%̨!2tk@A9 qN0 \)!bD*pPRH 3k'D?XC90i+XGeұK"\VA@b8E7r{' td,Eyd(_uV yj2WڎebE ]Vkj&qe+%rW 1r 9 VlZ4Ȳ"Z$$VS@ Per9A!ʰ*4Q]j>|p.#i~"4(~9Fz3RTiŬH1&@T1yQa:c֠M6} *yX`׃rzhsJ ^]-^F$>Hp"P 3 xs:p~>m9@VZLAG=(]I4TR4P<9?X!̤BBgs՜@(@dUʫ f&deZCP[&Zi/! O×Ue 2inuG-ۀ8 OdAX? "rΊf md-@b$SU>}tWy,yP'K =`Ye>ZbM=A%dB>hbRmB%| Zc$`Ywۡk@)@d*5h*,C)9iDκ$!;V@r]Zx _3` &+{Nw$= ̓ "(YC Zc~6 3)Hd(J+12@!~ЃRA*8@C8("㬪p*yGP*LJBȲ$ 3A6N q!Z}EDiO V@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N'B zN-iKٴ0hw\(g'tiI @b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; NaNN ؜OO WϦ'Ek;):V>@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; t@KmQ5'•~6N Z ?~'zN  " N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@z:NOn@{og'?-VS:{sOyzs\^| KVJ!c\BRƸh8z4KOѥ7VgWp4+kl:QZ;\Q|pe(`gWns+Dѳ+R2\=Er6fWؘ+ )7We`1l#bNkWu+ ׋E]!J/ zFpfWs+6=\!J-#Sj}3//tݵ/uZּz7يvHx1P1g lD⣯Nn6\"^\9i狲jt-v[8oj-mK->ͳo3݂&6I Dw}u҄qA+='`rYX" , .rҏf}{uUNn.EJY!&6U1ZkXtf碐\JLGWnNk*8;O٬P{Gqgۂܫ}ॷ~ G~K6įW2#[S+pu襗fW7p \Q^;\QǶcps+ 8(Z(J# \izR\Q1nz.pE~Y?(ʝMKWу]/W슚bXnY:_SZ1jTɋ^|qr9_|gr )~(:?tRR~q{7z!N>d- Keju_ۿ_^ngvOŧ'z5O~yyF@2?l.3kHxDWc\JMN65!}Ez\BJcrP/>ۇ"MsɳW4?n*'[S櫓Qvf7!}צڭtk ݗoRy6X\s*pwrX 便ӛv&}jA B E7´.Dە{L]P쒢_.)r_OUa՘Y׊%N"Dt@rsTTt]n/|Щ@4sV&O$ȱEJǖCԬE+,v%ݶ~_)7W˶MSwmbz(z;*~7z\ܼwcݻi^8Q0̯[4E~ʷ aCy;A^}/҅R^nݳZ;Wm8wEW)mottԲ![VkL%3lp axy3nSM9H7eܟ%6ྕ>wN2QoWo06gCϏgzPܠU=qxP~@(3nb8>/d(c`G`3Z EN,Rj6wP(o5?hbTkDW5zqhq6^0s'X&STz0I! 
ʗMܿ4t@k,<`57lQEѕk)M& ~51 R Qҝ'>gwТ퇣w5#_OD|x{mw޿|ntS")Ϩ΢TEڳ>:=?z&j(˽8jۋ +ԬL=VZ5y&:PWz18~ 7QLu>})1!>\|Q;0Y` "4ۊFiYTȹڬR(:$ =thJ[}3 h+['lVlo~o}-%i=[S̯D1}dw`mcHuzPa=u +1`5B6BjYdX :ez֩EH"U۬UV2VCZՐ MxKc/`uR[K9p2V:גrҘ&UFߥUS)EĬspx+ם}L;3&g59|Q_{Q%d%ghb~t@N^[ܾwϑHW}ny.(7f|0괮wKPZPSjĖ ,GTWT¥7)2{=glBk=P;\_{̪&M $5F¹*adq C0 X@,⒘Cڼ9t;݇]_|`HOrni-VŪZ`mw^ frSɥzۅO+:u`H쥪YiE/PybN Dשraꭎ¹jYw$池dq,ޕ ;KQ5 uQRj7 3KBd/%pFԝ !Tr"B3*5TS8G`^H +mO7'¹GQz0 "N"5#"#Rjdz25ueuZeto:#/zޓ#W4Hۼd``2axmdIo0dǒmNHw"YU*$ۭJucER!FG@Ҥ 5i&4w|s#~xlklaM//|qE&A 6IkXD-&.)2"RQ|6;ŽC1?T7ㇷ–1W5;:#>k}vu4$D(*oz{~\lVc&Qr9A!@KK+ID92΅>ܕrGD a*I{AmoM$:k#QbrtRRE}%J_MS\h-˕7W8G-!1@XJ£J4tm S29E2C<_ 6Fd@3z1Rk4l]׾!>2"xyyuGP\vr bPxf2 뉋:ԷK}8SL@>,Wj"PΕH Q0TL1zf^L a'zҒx;,*< 2$ɹD Z%Lԟ)孠 2qt8*sxg83/E%,"EBYFZ") gɬSmDIU֝Ea[LܔdV؁<\*vI T[IUǕ0Vd^Sy,f CnJcP_t1kp8ؤO#(Q uHMDv \[?M9('ڕhU ]i"EtR(<"r 7]qf$Uz%$+#8V;E͂v("\"\N6:(4LjMeD.4. 3x%LZ-]XwΎ)5|mУ[Y$H#K;&I/j$DnDHs$kdȒ>EZ b| AYNtn/ W10ǒ< " N$8P70pH>^X!" .WV'5bɣYNQ'XP6\2E8E%#l\Ƙt M/?;hڅBDG%%nӄ0,h,0+)6y !%BvF0yc0Q($D]9WI &˧W;Z[B},ve]%?1P>>.~e3`4ˑW0(yVfܭ$ܮO"=.>WY(P K*xV5Z Djԧ=W秓:,5jyA~8USV )S%Nƕ+OJ{\T<Ū b۬"%7b0Yiv\,F9-[_N`^?T!{ߤ]xe4bneڡUMjj?]4V2R_Ya'uDU&ٚ*!h [8"穛 ~p]5b0W-g\0n~@~ssEjhr>K<(,E~ -WyN3BG%\pɌ Etœ{q7sZZBh/nxRfdJ G-An8JTj{sj'Խ8 'MvҤ=7=ݘy ũ>)~A)uJ)O橃+tJ)*mJ04}iQ;>-Wb 9 EWDA!%b TNMȁ%PQ{Ђ6*b"C2*ZF)|y^N6ypKvlXwos5tM4{$ѿӒittOґLvw'DvQqB5PQJ҂eV$G,.WZ2pPhct1B&$A)8O)SϡuPџnV}f9f9wﺻ+ 6Pˉ˂s!eo.Kl]?.ۻͮtYu-Kl&߽mIϫw;r;>L-om Ty6_qtpZ{-kfwۦ^_mMܥު?F~T+ͭM77\7\m^̷t*?COc86~6T>{R&Ϟ'p7K k% TGMքH.!pG2Qن _aalv_rYa0W,F'jUf?mZ߭A!ATl 6p 8dcD2K62湌RGIr%9LR 2 ـˮo6&ʬbN0p5@s؎ !u]( :ɢjofQmc8Md޶nd|y>ʦ:]==/9p :Q@#1c h-U1j$rpŐ# Gȸo z51C\Eš'w9W$ rFL0XZ$B!!9QVfZs&s2cHwjO8KNPXnj3pvd8~s!o %n.սU[\| eR 0_1I3J{s.SHǒ9NQ"J1P8fX!O7g$JŢE۳U.d={Pڡ߮.q|1 V/'^{dgjsqYM3j(6P2b9a,rF~V7=|j\jڵ&U"Tϩ<"{f^xx1Y{BݠUR&fIU]@%ohLm^5jn^zg8r I}(m{ح=\c-n/my\ 2Ly[fۻ[Q&םn6غ[i!5-Ԅ1kh Na{a# j4iB+@]up0N$wI唃' O^Asт% \3N^NʽJn(mŠ/O|͎½U89%E)'i4Op?5&:}}M8ͧW>.IK)7Q1_ JĄddIԧg;]i'`> B4L# tO4YydΧDMG)$"62mrFjL0O"X2@ Xg\f(b%#O}ŗIϲvY\*zNҴNX}ׂ9}p=Z.^`TBL/ޓ)Qr4dFPjaVJPgj{H_!)vnH-.ɗ 
EZeQ!ek`U_LJDCƀe3~}.M! 2746jĠ~촜E? [؞VtOd͆(TY(# .dYJʩZ{R++*L,/VGi\N9 ^gSi6sBSVV?k,UIx|\ սe~E"6Q\~]vEiAg:/y՟-\ͯɩ.B0HBޱSiK zub-unK8r"ͿMю .kYUc3C'V̷n_is&úY\}uѼ!5`muC?]~\uvOiq7эg}Z=_ݹU)ndŨcG9ϯ7 .,K5B.=z&ޘҠ<E#0n7V`P1R(߲}㨜NO]9rzL)DSJ6F!X7k,HN΋Q hew:AcL՝CJ^Dnwk-2[Vn}SՇ ($e ){! d$8 »1WNw^^ik5݋ʚG/ǒߩs=P=vaʥ.zLb'gC4^R> )c Q*>@%IEoSx5t>M<2Y&30Nmz#ڙLjǐBc7(ҟҍI"{I dr˻?_>syG_v ]l~?nj]1^5`6Pɦr<߷?iZ ހmBL<#ymAu{a\ +) a.CAE\"-cMQY5"546Zuv a^ڭR%{ACs} etsqdAk| <3V_r8:t \|]lJD Z'$ {2CӇi δ=#m%K*58+){AY P 2)Qh# e,h(s6M[M|Mjm㮘lrFc7"VVbNGt6Zib($"YehQ/E d +@I]d'qK,ޕF^a%{I<+#X;a ^ɳ^t۵կWUjklƼǮ-g,-V>E },gUZ~Xl5-Zk`a~>_0u0EYw ;j٭ yׄp;5<c.Oծ-b-7o؏wKa,/S $/?WWL14`/TwͶ_۟~xQĝFրG&y6ʛ14DZ7 oM,ESES{M]oѼB9Y`-UUݡU^ \ ޼B']W&p8q-=/\=JZĮ'¦Ay\z%:"bPGWU\0WUZCWUJ'{zpIX`WU\$W(i}Dp#bWU\):\U)kWj^o1|_]s0䋭'5'?.uEDoQ/!!EA6j,  9&[G/T0ω9Kc厍,gu: 7|0_?Է&^Ai4% NyMkDLK܌ Ih ERFi;wͥ(vil6]#?>>~R݁mpƃiM},`ww۪ @\PD_~mrgJn?PO~+a3rAV4uʄ ׆|{ fko)]Iͩ5 6%OڪN?E8R*fFKAS&t?[[">2p y8ՏYIDr~Kd"$)u<ɢ͐Xt-I(J苵[)3%v|jG[/kU'KĪ4h;>^֤&yd{l|*b{õ 6 rέ`xm ^\lDctQB^&Qg` _P]'.Ǿ}c'9RNlQ)Ky}^.<'uB"~&X+&_4̝ƍXBTw&t4ol@!JҨYdX:{ zΌFR:EN6Td6JFR3) NY's6(Js (!Cp1KUDJ/;NCAYgglu/d @j]_}@uVg <-eyKH,Jsp6 *jf,IHZ) mKltS+&{D2d|[?pJ褤10bvRƮ7C;gG'CJ&G]$;䵑0U0䅯ZCyKfwyE]ם':g6!55^^~P]85QB Ju.6GOEz*P'M(i1Fkng0^NQd ;Eڸls=%(xJa!ȟV&mG/R'r; UP9e3`X(,#B&kL 3q6ԳV5ՌgХ·RΧG#^@BJ)Jb wyձ*#k e'{R>? piR(dI,LH'օJJJAYVgT+^_vRo'͕Ȓu{6,Jß[`pfxX`ASbL49!Lm,VzwY-B]j$K /ETDE !&f31vQUo '~F_#©>j($VsFcp TxD3<qTyGmyD8xrB}L|u׏IВI(qQ0Y KMBe&H8Ba'f) mGyG2v\maȾ!Z%:Ag$XoƁ!y1$cr>"Pꀄ.hFX}78ͯ-EI3MM<9DK'7w(O!?Cv 7]Q8[4T$#D}0ji*HaJ f"" vrE؄ A2 ǨU1kQRk zqc[gǒX 7>IGtñ,hf t(ؓ( NE9O1 VRe UQ+^ w$qZu& R"8IE$s2Ja,0Ǖ# (" (O QPz, 2 BYIwGaDS <JLm*S2gaA()}Hݬ@LlCO}%v!HD F9c>bmAE*j$B\zRh'D;g;Bj9o=k= ) <c $@O KRF8TO;zYB&6m< i ݤlp}r'~|:In`2l1.3w$}juZFtX6?+ o @m7H33p[י^ʌT*-r/;-j8o6%J^+-s}ĢP=q ✪fF;}!gAJoKޗ0͆.G!q}+5mdd\,u`fӫrY YInInNۊ0Jת'#tr)Mi'f2 ׶{2L> }W? 
F<8@+*}4 \ k.Q8,γ?~8k dK~gd \(6E"~mEikQ64*b_Y{c'S$tɛ %l[-յ~L` dz~j̀Orgkf7ƥ*R[Ue{zB4m\afJVP$U`@Zg)e4NrxI@:wc>@#@&Fr 2, c'`"LdX`ábݹS[qUYk8DT}+NwoPOsc.¼TOW\Bks~9-\SikThmP&`2Ńk.H`b|.y/EK\vDEюWfHyQʥ"I@E)ا7 Bgſ/iȋgɿYH^<oߤYq8p|}VBz8-V t1)(@ޒtVȻ%p=ŇBzvVVϞWyliSnr-:Ѳ2o|]gQc#+'?>yST _PTiVLgdgׯ[YS^`۶'sSS$lnqh<<*࿞G/4|u)/y 'aI9`1!Iy{1 BYk_fKa.[QH>0PJ_R$NSh>8# P S,yZ Rlأi[&3ٲgJ >u{@nm2toCqL۶Of}&'u1]0:}zI1ٺń02u/ݧݶwfXa慑a<nt{(|ͻNMȳlmNUo:iP^~=tοt m=NUyr\ȕȴu|x3XS݂.F#;jqΜ#XZm/?7ȓ/ڡO;Or(puE B,!Jku^k,!Xq :pp:tKb Qp>uQ[3TJϣ #ElHeQNcKp3BHkKI O-ʈY,-,xGI4 GZf C48FG;]yHo&ɻ tg+xߟ䮬^eQ x|diF!CQ"7a>(2$ 2F܋1aH3Exa >yyR^8|?}Z\oee庘{`iNo?\ދ_{sQ__}X1|f<ݰ|_2k3m2t^PKQɀ'WϻE!;_aпgM'g {~S#i)eף߾L1 d*)X?)ho{_K=䗊\x_o+Ӭ&gK}X 4Wo)1:!XL± SBоG3KZ_"@śR_fی)`G =\SyiYʚ\zp6ZxTC]MTUCU]@9h >J43486膢K\ǵ[*[ϰXn<g^rCW5mjO)Mf)] 8S<^Qeju dE[kRjS,!Psj8 X乨$ъNʮ`DpPDO?ҍ7lSJ. Ί:mgkv=s9NIj])IxԏH70ݬy)!6Bbi\ƣEAD{CqqR5(|Y<;BY"^Z_:-64ÝM/O-0΃v?sMN|31֭; \IhM:R~LVG.Lm >:;90Li[6f7Py,3;[rJWZ#hܻY;njA6L Sjd0shž&%T #7;S<w/m8;jj{-.O ylu b bze8v;4W "}jD S:DNsܷr:9@SmBx'DP:54`B8n ` NR.^WScZaz ]̏7@kWo Mz'\RcPJ5\ l|!EZ/Q07oOq&76zp*1U^3Cr9VQ3NYpXwm ̎Ss/Xگ5WD:b=(t4qR(&D*NikOB]^GCJ9ō8`; ZDl f3ø)avIPƖU%dy4taa=+=>n;.msA?bnWO>bJSXTb:}RQ \u͝frɴ03+>e6EeDt:OurPErPQua٪g*ǡcRd}E֫ @!Ha'ёG!x%=8GF|ԩ1BRԮ+TH:"b,t98cx,e! @X .d 1ԫx3 xD{ |KʔπVI6Q ^eٮvՌszћاEWפ &uD9h0Zy &zG1Qz]zpY뵈|k-u^ޥ#XסKwuXr]IX=_<%&}_x[&"WaMa\A"Pda .]<_o . 
^vG7钮imUYM{i+]?n9QZPs ]jD}Wv֧Ѵ$FiZKri~3mA 3 |;WHCsKFD]׏Rr^D=v[Ij͕sYj]Qr rZ ESuV*`8Y;~,mǶJE07fz f*_m-C^:"T{m\P4YAl@B}LF@t{a{P8BԺ) n$R +5p(yoDUsya`7 w5QgڻLfnUhc:'(hR"f=qjV:)}DPLw{kFOܘٵm00Rh!@bc,xO2OX-crpX}V+ d ӷȵ$gI5HuH{Xv"Txrz5j6iKYb jFJb)ûi(R%*jcٴ|D;==-$KdR R :Y#4 {ʁJŘ X\IEj6=MWwX-m i7=}11wit+Y BAX` Qe{ )d`rкSvPjB(X&31gU)ksI3 set#&'Ǵ\%zWق݂Y',oخ>[zUu):yh0=գLXM&LW&ED0EJA[& ̄1F s/ggynܨg >Ֆn ~v |~5wG] e8`e}9 ..]L% UxS4+~痟~x}喉>eM2IQ\N/gñϟEJalOWF0﮾N`쮾J\C NZn㮾NJ82w]v}LG  z㮊]i%;vwUTU^+swUwUUUҨ^T=rW(0qWE\f⮊@])eCW/]:8 (]GE 83)\ƓW]v,KQ:eǴ nf etezzz0'Y@)q)Y 7."`EO?>^~V&]-EXG}e_WF6Oű-n4?`Ma65ӓ;q0f:MR|̐4+׋(DqrW8ƿ.?s=괥]!_[wN)oŮ'g UҕfE+D,KU5?_?1^1\KWM\?'xoSo_ ~K{4b,bnvjN%6 <~X+hŕiHG-4!P m@Z2ZG5#rZg*'NJur"y@$qH SY:Gi1*&٘(RQt3*ƻ@ovgt1nn_=eK *qf Ůћ;m77#_qEoŋZ IcOº t$@x-\r{L[yV`_uԇ/hiM BV 1TYbB)]2%FW v +FM־2ߏ^?_XAr?ݶc[3=3E z6z-Dm=Q[O,7+k'j˄m=Q[ODm=Q[ODm=Q[O1˯d*$O,o'1=Vzj&TJ@ 2o4!0VApcG=sB3C=ضׇ?qE"!d NŭFFf:e-N!YϪ#!%BHu),Y[ Y$NcBk˵TInz#Zfl7pqHLQ3/9q2w\'.y = 6s #J' A2C"ZI~Pv& QǴ;sc:/4ZIhV$S \Y$g 43N$Q粗ѼuCUP zh桟IRAèmJJS>l5nAALIW{ebe".2NVcSԦUyqdԾ!.Ε l1?%᧋ $JJKIVfbsx4G F_&IfGG9,IQe0f)bq$xTL{< Ř6*I)7̦=0cp[4F392Y\U&y`(׽.6Oߏ$zvxcB(B[YT1&e-%pӺ; ZB 6$%+x`!G#%`赦&T6jl7~k!FFnRR.KX5=) l<> KtKWnWR\w,#~lK.mRaR7T}/UFU <*!Y`G)&lr0&fH\BPU=2:ǘ@( ?pAMcvrTg?HPqǮQTyxJ~ gL\]EdJpA$|6No:alNkJr$A02Ĉ: $4ME2! 栅 BR#Vm>$|,:iɮ~QVbh N6KkxDLDkOf!KdJ-q)_܇_hʉ ҉,d9? 
,nb/?M2+X<y])f+XJ`~x8۽dz!0%{\k};}ɳ_O]T  ɄvʌJAIƄʒgf䟥ib<: :]jHP!F)"ւ!ЂK4$tm3홄١vjl7+5{\cSM'm!O>eԴצ]o#GrWd/rg猻8F>ZeLRk署ROJCi$ Y@aUUoDm&U6Q-2pvEM!z*C W,YKv8>=,  b{ -or]psphBW.'e0^ynl 06ثL0Q[cxI{A^YJlڮN#28D FrwyotT:==1I"zf,2sK_{T~SǼ>BgPz#u ԑ )j3HbstT/($Oh+q YH1F1Jnox> lXx\Z(U9>D:F& 2C2~+myV4B GF%2LNlcR|eYDl!KߊxFNG9W f5ΎbT9d2|NeJR .cEК%8?@" p_9R(d .ؒ2Ree!`ERV`UT<NZ S% cbQR: Y2XaB :8AUi'vJZ_Q2J}p"IuAY 1y< p{VAjjj<"HxaCr7{=;߰,Wzl(AѡhbE$1[c^,+#c2[1udm{U) 3KXa b,-4-DdT$WAܶX3] [YzEX;Et'9 9YKQהqqSR^Fk hܛEeGn A*T "1TH,9e¡Mev%xO1ԯ =SHJM-\1gJZhQ<,RQʞ*с%vP%+n?kc9iIȝqN ,i'Sr\M9z5Țn_Ζ/?Gb=hŏU:y߸1T2-~34ufיbn 4L gqM?{ M: J6 Kj +w84ĺ(Y5ij:ˆnё8/'_ryF|UZ[#5"Z>4y1|7KWh * {ܢm%jѱ;6ZR =XOb{ʧ|տhvf(%"j)ԗQS\´&7}ģvݎH(tYׇ6}򬯙}׸\'qpp',OXZDCДEr-g=Y9vhǍ75J%n9a?3N&Ⱦ-Hk2u,AJ>qHU~uKO2А@3p{rІIT"hsP I:i 92:2*@184Ic:9A-ŇS,~^[ ,߷9 iG MxB G4G;D**?r@LĶ 5ׄ R~'>/8ai68zr!qZ<\ɘ|.=Ly`i5'~qev>_z6}F\OZ_d\Sj-h> ɸ ?4@Q "ܔO4YvڐB݀B ]_DTJf5IeN)G" Y`dZ;!e(|oiCVEXf\}d. KəeIܗh=jJҳ!9ںaP o_yKr}-ntݻw:R Funk]! ]غuմ{Nmrlb [wE;o=_jy7?g;[}zw(Ř1g2 bZ Rn7דZi8|in ZnW]=Z?".H<S RNB|ٓn/ĝrHRh2":C\qY1@9ц140ӕc u Ѩ!)r5AsiI8|KEt(4O uʏ4"/>݆!(k.(=B3Y}86YײV"3ۖ&r?-Z|,\KJQutLMWKk{k6㲶t4 Di9.'u[fiNtX:/CL͢ DӲqܚ́(RJRVG!NħDi.BPR햗 a":)Uf6e${=¥4_SFyna#z-@!{=fC?6U#WǬ'<$rx/P+X}z0y:o]ĦM;u6XI)]c}q醽!՟,׷JYJos~MǛqGb ƈ_|lvH2 GH(-.ɵDÿcb}xbZCr@=W[|Gbːvún>1I] FN^ΰM~ Ȑ>_:Yu5Td}1F?C4̭vrrsqi<,fKZGuədQ(S&Q59A6y.%#/r6_ ,>Hz83TBI_Td9r&=0KV<hK 2XV) r\ *soW/=>={ U;|D7o{/s/:C)`%ΤX$ragR, =bMS,BΗ')l쓅0m E?EF_ϧi'yQSgk6ӼֹGU3gUjU>ނ10q)w$RtDlKGOM&fxRy&Vs0-R6qݨwd*sJ *BAǰdw1vpoɉ( #jͦ2h0JJ*NWΧR5K\ \!jwzpETJ;+g"r_xk;j+rhWW 1K;pE+pE:>tB* \+"X\V \!Q)WoB |?t"r% \J JFzpKpE;3p:3Q npETd-•]r#h \rg|WDS<$\YkqO|Q[]Uj P `oo@r7F3?&3`!+,WB6P(d!e+X3vO$X?\veDjwThK "xHQ;eRpzu\zǜyrYچ`K ێ\_7vT pFz ;WDb;WD]+֊QWo$ʚ+$Xj;rBj C+R \N\3L؝+"omGv]koF+FCm $Y .&d$7XsHcՔ,Kd2e-&"y]}TXQNP]egr(2Zy(eOWHW8B^ ]e`JUF))ҕ%,2B gLu2ʕ-=]]i @KRW]EW)F]!Z $JKix1 v4jZIxf%*̎ԍ1_UUU"+Y֡Q'SW)fg4 %:fxq dtݣAjeޣ)ܣd7!]5O¥ǥhq^|kc+r;]͏jFVVWDW0]RӞN(Mtx-]!`cʡ+$Q ]eG"T=] ]1ٛ.ZBe=rYhTV!J{:AtQt!8-]˞N璕ʀ*2#:6C@t2J)UN8BkQWb rwO`%%2dCWn9th* ҕL~mU 
Į2ZFNW%N++ =ׂN/ +)+No?uᄆ7Wڕ]MU 8ضސ@$M0(fp\!C/0)?TOfʇΉǧfű*VnV)պJޱ84+j߮DWb*h:]eZtutE&TDWU+Y)tѪΫR랮N"dAt] ]!\Êq3Z!NWet&t$g0Lq3\V3 u({:IJQ Xb*5ev2J{:AajhZ ]eBWRw3JFz:AR7iZʀ RW@NW5XD8"s&qL憀X }xS{{Iؠo^L>?ϳt3w%mnzGz)BZz;B^N 4Cɽ<ʻSNrXJ&f2j oLrQn7߆c\ZP)H(JB2Q43:E8 828mf&)ܜ;nAŔճٵ`&Aۻ0GraZ!{뚔vlvY̲wlGznKqnrC-4ϻ/Yh~4,,zTK#I*M_YTcbgţ QFĔxLAU\ Qx5,^l/%&Zx;v^X&QhW)BOFh4SAd2y"$OT ҃E\ 8I8s_Lw O]7v*Ǐ}j~Jz[Ry<3su+M%e ܦD %ў[ހHY#Xq#ښKH#:E JQ*d2myQgq<gi[]b䝭bۂ}|Ư=#y()̳|4#W8![jdhQʥV+837gq U:t$zQq5ee%6Q$xi֜*Onypl/],JUl>TAAݯ+g_hd^Gph.IQK"wF-xvh*hV4no:SJ-Z)T>wmSݽզvt9!Lpr 5ŴC/qRH\k%,FVTkVqK\嬣IC7#}*E]]|&kroWdMl{a> kNwM/N &UJ|:%^BsZ@E" Csɚ$ Sv cH&,k.>vx7[m$u{;%W\`ԵdMF+$9V+mhŹY,* ҵHHyXU,uL fx9daP$HRhH&pv b5ruv,YUH ír|c곏[Pf8oE.mm[&+>am.Ad)b.J@|z%l-KVČ.x'{X<'M5+GU_ E KU2ĸ +<ÍFڟ5cLڗs9hR :!@p >6yP٣Ɖt蘀M `C_N3n(!X>0yVخAТ)R*⨣8FՅO.<a xQI\b!- IU)c(Y ZJmTdIED<^xIjSOUJ;G)b28٬ӣBJXiubx16. ZTЕCAeX?^CCnH;㼁6w!QRQ)+%XyB!ݫa/|k 3[pҷ61ժ WjJ@y S#jx2u v&U\ tv3 >vbL&eFDj>3+e1TV(cGB$ EzT md!sr,.FΞU mz@pk5MQMmV_nYL(5fgK]uPmtN6wQ K0^J dc9xq;n 9a$IPKU$~a=V;"=aGgr~= Ykwߝ%oі7%JtƜcՃ vpb,L~pvu\ўr~g?%=H:G*DLj 銞4ۻ"8WxA_wD<;f+/υr7[nNY- c12h?~4erKg=;]'ruOT^pΈ=-g=˹ X",Kf̀Sr}9vzS!j ~4u2O3'puq:puFHz, 6y2O^&zoǝqFF9#^sa*dE!n 8< 8E/)Z%:j0u ?u^3D$m9 B.~"~b.6v_F4{2]==-l1\nBȡ/Zf}=f1r7?, >ۼ~}x~ijuh^杧qaoA1/jJs|K2K5E_Gh8K6[QV_'^1&zntݞ%qgͧryJ45?ag۶'l5AȄ.ҏ+E '5_TZAI UR&)V2[m+IM Oݦ4{Zbrv7Y?^qǵW=I/=*(֑b5XzΩCLqV)&DI\,4³hc÷,9rT`1F/t=̂1rA39,8hLh낖mf!iF;Cy~\%@?[;hpeOe\!lUW0W!嘍^d-KIL%S&v,9PNO/Vd|lU=ykːz2zL2DPL TOsmP\EXYIK pVҤS_YszGɢB-KB 1HNp$f-($#I'&&#Dwz۴"5 b =Dԣڞix!g"+K5x~ygo\sHS2vR9yGz#Y1@9ɇ14ǫ񧔒W{OJH'RR!i$7?% KA<5.`NEKk 5H,WTߚe'krMĿ}%dÆ"Z[" hCV/VwwӘ8_`= ~,iFYJ.ڲLLD1RUҩ?͗Ӹ:yzXU:@v8[mh EJ5(Ь)- Iש`$LނSBlY?4WEA/'Sp)7{Xg旞+wL͊jчGY?[SDny*>=j~ޙMOUsn7scw5|W&X k}ݸN|[-صMDO8v?'x<ݶIWakF/e>OrQ d,#/}]wrvz>Mw,̒LBhL} ʕ,\,N?Tiyg& D,^9{Txm&1i~oX٤c*ڳgѼS7rí/DԈcLqiv|zr{.uv7!uIK |t3" zţRQ2WkaRi$o0+ySJ|9 *U8ϸQ>0r&U|)#A&NIPyRzU^OQaV'a.8rԧ{V G^q%GTDOꢿR]4HdNٗF`{7ѷ.vK MZ81%yn}h]G=%&Ƞ(dmx( t 4/ |&8VGO{x%ˏڒ}ӑ:ډ$YŪsHSk2F FT[ E 4/-Pٛ蓬f5&Ҧo(ɔ"ҥXf1/e0rў;wxaN_^@MA=ݛ 
[ؽ8>YmH}\Ӓ7.t*yyZ^,֛V7-O};xV߽]wb\sRqXksEe=ƪzF 3K#{5*-V0/:ä*]%)Igr/}#X\OhX9ty=; chY</g7' K ]H׫ 9~}j'{ޯ{kOMS7e`q|,+OݷtNT'h9{GHEʇ;v~BL $EͯY)q,z5fS?}u&OVD#JҦ@VTeVfkQ,&ڟ>_{_*Gawګg}?;>-jQf!l%ɉѪ8-%h&(A͍;ҟ|_ *DW-JHs!cZ"P@)VZ (6>%|'TgKHqgiZKK<9㎤{Ez]9/?~R/SԚ1$kFH)L$e`Xh!`L"4=:&k0Ʀf`f j$ :)C)Fa]q h`@j^_2dYZ+mH9LФ4IDPQɔ\ #0i#&8}ƨ2:ּk94iT[ܯB,=ÄGzS9D'&CtsdᝏYl4Zڲ:$DyB )SmB>q.ڰ}ޜ4AQ55ksH9QQJrM/Hr-; u}dsil R"vUS5}GYGW(}FpH"NY@-ri)#8DD/ 2%֦R N~'@@ ФOJ=k]:gICR [AioeM1jd䒳/!XgM(TGhU7W*KIndJRP/q*)c!rNc%'t*egV?SCۆuqx(f $JR-(jlw%d.OZF%"yBӔ#d5tą%jH,|]VПڊ0i徆1i0EiޘD_U*@*#rDJC. (JGAE^$ݜ6lWH$(2R#!mMl`'I9{@z=/5 !IBmTz5aɸzkukk7g' cDC; ')A_0!Ì t&^E=s>؆ ^oֶdz|gw!4%MY # G0^-RP8tLJ#K^J.G[ -*``1qL΢&icYN k P$RDM+ʫ2F䓢Z0҅icG>SA^|D&H.:t_R˨׬(2X`GWCƂչY,TGWgj@bQI9m,9xŸ́ H6V`a;jy\E4Χ,}F_b=^EDB.` P @r!P%jI7k)zc H"&}LY%,h%dl9MJ6?>C:kϢ;k4Q8FJ"5؁lJ&M_z+"V{ 7+ݭE a {oL:IL\@0 >Y tStrgAVA)E/bL ڊIW9F DO+(\u"Hˤ #f@hD2.$Lčr$KF9+u{cgXb*5K}&b!}N؟$Ҭ~\w#I!:N.P,=wQ@Ǖ1Z`4rˠvFQ!#`B?0Հ[oߜ^ܴ\^PH;Fz 4"w L/Չ8A6'3JO|%(fxMo߹baQ,CR`'G sW7X/WY?@HmX J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@q@ B:%̵`@۽WX #*)X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@q@I (B<%}Wu+`%r<۳@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X * rdk(`w{!@?H+`V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+~%W[~?GzsszCoݹrwӋ@F'>-F<-aˀ ݝˆ{ڰgs{]!lL(ǘ 1N^֞`SKb,ecvn*]/WoU~-pI@Zi\FQ\}JWy00[/WiU0v\u;,U]LE ȋ_l~rgC|wh|?Z۬_3 WNW)gך Σ?kgJ;$/v߻t~/vt`q|wuru.WA.䒐evE,!wy,5t妅x?x* gf:]֮'1u^ΙcLj{}7?YAiAX5p#xЀ.U Lva\GE 6g}z<.c_~ͼq9 뚈ZxŵyA|yYk7{wDEb{.n'G!r8^3b gY> 9q3kTZuL$Q04BXN J9I^l7|>%6:q~HhF\H׉sٽPkV4ڱp$Zި`F2EJ]q%U P1U=|ԳK9 vGtAp/ɣۇ:Y;@h0DtJ*MTKV)'BYHxp%KF#{mW>qvMaieǏc_4q- 䄖cuh%죕>Z)fJAﳈI\:Yg0)"沗IGUP YfB%J[۔ύ]]j;e7YcCHieXi\7lh016MceIمFZ1^2t:Murs>G\TI)*rù厥hH癤,NoKrSb(m̂"o5b Mʨ1+:K$uCdhY& 6$/ '%o 15Gd&TVj9ݤ[(&XJFnRr1n>K\E599)b<= w4|2bZ$Ӯ }kcF*||.D{`[)(U$!#slHϣ"RtHh'mrdF0&9tU˨ Tj,8I0f!:Q9KPs%]&#U2VaXT$D[CP #mrFަ_n`d6A? 
A;-v- NjcHvG˘"`-!2:Bl%ˆ,1"0=(fSh Zz1c3ɐV2 hjouԚaǣF1jWs[x EAdae,JH r 1b"8 4Uaa.xC2!C.ElML,hH`&H6>pAƲ=miզsv~|&AL0E&ZDQ""qI93nr!02dR$|6uIt'xŜ0j51qRRp51H:LE2!!flMEJ=Ŷ+9uVSleo{wE甔O2`YYX7= *jPp8H)]|x6띵T{+C}=|,{}C;Mם<Elɕ0Y!Ap} E?ZkV+s%. ZD\F%ȥ)7XfBCJYgQz]@f1=^WőxUbdWw7 $FAi#j-ehI8mH2单B7&('B^e d8 {\#]Ta%^m[z+Pk2v3otv95VoNY:zo>*B^XR.J1$sŦ4 װh dsu~|0MtPUi/ ["AKʅQ"ȸSkKF jeM*d1h(%O!jWfM?O $i9>%t'<6#ɆS' UDDp̃׾ZJYл?f̎Td昀ף*Nq9Q:pd |&@IzbN:Ei+qB[Ms+_zv)dɈ,I6imPd8)2)6 `O*<]6 bpq.hL/uY(MQJT+M@#WkXc!שmEῗ+ W ױ,'>~{k[q$x/4M;x7*V1aWN>Ledۋ;]~/~[BA(o(vUW٥&划fɴŋbU) )&).?*~nMX-g?ݭL$w^&o/r n@2Yo8[EWV_ޥv1I9MY?zw}9ƅj<m$ɿB~? vv>\ŖINY_uˢL˔-;`lUpk.e!/SޮH2M@R*1V[MV=BgRTL&^Op\ ]A F<8@*y$ :p}{2 yx[oo#gMF' }LyD^WoqzeXFؔޟԸbteg4Tݳ'kB\*\~L`5ybNpA6O ֔xA4Yƺ{'t ;_x0Sb3VP$U`@Zg)e4NrxIt>'-7zq71k]؎`iP%8͗*@Wi LsT^̩{qNm;2'A=xcӊ<3yz"T:sW`W^<_KYow"Gpt#^(Y,ҮlG}q* aKm*輥 ʔxxꋯZ / 6qA$ njgE2<8C HYx!xbPW13` 8ĬWzԦiZ{}[,A3vӡ Cڣze)/ 5\apja1+[VYw1%aZ.UuN;g^P%PAbBRˤ3 X??J~W>SޣSUaNH*%}p GABJ#X~ T,B*A S띱Vc&ye4zl5D*гd8 u= վ7Smߙ)FIk-IWWcw1x.at]V7s8㥳1w,ظ0Mݴ<|kCc7/܌h]9w0v8͑գw؛pYwez|]-kmz>1VC'i5nDN"c{4 ztF#Y:jqɜ#h<M+i=xNOyq=̚RM + .m{@CƱڶ?'ZoD;N9 Cޟu4P 2* L˰!Jku^k,!Xq a'!UT?01(PD!T=DlVP)\6"k0p3V q!CD9o+zO qxݔ$ywޭn2w!Jϭ#3HC4 a|TTʐ(hq/wXF8 Cq-3OAhe=8K2 hE˓6䂗pz,IT8O_.ޟ-:,h8yb>PxX!O8&D@۱0$Rqluj NSU$3d@Q#ĺ,ZlW! 
Cwu/on` dORL<p=h}*셳q5FE-e㎙vp=X I I 4mSjB C 7#_ܣNwJNԝdV '!PsJOJ-;VL3 H҃"3˫7o[ h\_LPwS }Z"|LaFa˺[ fzp"W8]JMh2,ѹ*nTW'c̱r:kg0>4U]]o a 8PSvj R55%e![Mj5e(zm M Ku֠;ϼmLʷ5[,Tx_SmM#W%[ ^ܞDr|aq 4wO׳<+e0_jLtkΫ5 =//_Ƽ-}tS K ꑓ &E\8mNWߵ@~%\ֈm4 G{dſ:bwAUݔ``^0oX u;V4r_s&ˎg xqҮ;εiDZSFN `S Mٳ zr ~% YG5`B8n `Rr-:hҜ2L0ųrJ[]N8{x=˅{ϫq&􊢔2oۓyOSRfY7MJD]O!3(%(FZʏvRh"S7Dы;-x=)+)]2]ٯv){˸}V\^{n-XD:\:WCٯr$T'RaN!'m7RX b8 SZR^b7"jm*;A@씲})V%,Zg_{Fé%w++H*$ Sf g ~2TSgWȇ@ {%'4RZٿnt|z28zPp$3]"I2`4=#+ҘK+pvZc d>z!!GA%+帓Fw/TM'7Cw }"Q)m-Vqo@L-YAe$XJ"4UJ IJ b=4,c'=Tspz٠pVk<}H)Fhl(:׏`%eM *{̷/ٷf/\g3((ػ۞vgI9j:RjKfmX3#CC:]lN^BS۸{:J{0Y*X.#@:)`fI]y&t}3UV3|tƥ>d(E\]=۶ڛ_O#eOu=ϷJ_o}8:f UvGمv z=HZO$iq.ڙ)'k|}tF(R!J --% ?}i$lE;nqb:n&1ĕsYj]je#ɹ9qRaLawH%Ex8xܶjYwMzŌ9V\8|A8C 1'QQcs҈ȍA(RZZdT fx{!Zz{ ;u'l%+mq~͒ެo7Ғ 1]Fk.Wgae,:uyߋS2`K.-2ʷs% ZGK!zƌƁZ@1LbaJ~lɷ1ʵI_. C$:K|]/ۢl?zO#1 &z)oR]^ry_H{cN>NMGLm 0b~ V{-UdtAo' 0uu)v\^T2i}MF,j[T[8x8S@_yݰtWjqn`g&wtFq`N:>hg|P,XG<3P *ON$l'al;2>M5U ؚ8:j!}u7K9o0[wя78< \"q}àd6t0(Zm\zt&o,3sOwPVjo\~  @6noeM؊eV<[?{Kie)ޅ\!ɹYŠjwŧE6tr^PܩLˏw YuJɫyV s\h肒j!܏ij+u0;se\Xupj^Jp6GIG-'o!YcmOffSN cͩK-hy `$s(7/Ų~"0MȾoZK2O;6joq <29߷:TTPJ. ǞX,fT- 2X8l([J`:|GuJ~Vq&ymO_4p$8 YPJIdSA⥢Ya96S&aϵ,lVWJM @VbXBlhz#8'\k=qT-8x̢D^Oaj[l `c~.Vy-L+bOhe&P <fhPH&0 e4zl5ͭVhai՚"rr֐7m#[Ba 'sjAA2f& ԢƇt^p fH cF֑gʖ8 2v}QzlO"zqyǑ6 P>' !Edk?H*U73rNVZsS\%Xq36[Ӯ=[<ÙWZQ@yph`-n6tҶ b&&Lc*d?ɷ`s1 4Jː&U5jOR7:1 `17B$bnFg^3i4n.$.hiM۶KGdI6.F)Z'OMʆb? !cV+d1U60<9f /8uT\vFaPI;8 QT)(M4u%d.sk#*ޤ8d,fе PWVџ1T1yY`0XpKAc(JA5̹k>P \r.vڒ*` 5}c[O[TuLr抩˔ lhBi f~;16lk,q,> vy~&PՀ\I1~hd\AS@`)ĐiHY03uH*nF)K<;1 :hl[s`5M|DbP*c*ʦ[ciT66% ac5ne ǭ`I p2 @PUv 4uEJE(UbKQ@+ho ~"RDbkL]Zʶ23%ešQ$"/=kO(@H*+9JOQq@X*+T햡 /\5$vn댉p0 d3_֭b_|1ʣ8I`RTRXC8q"^dߛ w3lۙХ˶RۈPp-*䄨G[-ky2AT瑵I|tx #/ f@tdnIKDBKJ7AyA3 vXmAԨƂBG1 P$bDM WmOAT. U U:?lq@) /. D,/x:7VTlh .d,XN~T} yrx*NEɜ/+II|'Ӈ6}{]ݭΡN‚.&jH-6 \{#@ ztm^K x1* 7Db6y.e}trQV]k)ZcrەhX|*1^U 9GŌ:hX%Y֨48.8t-I \9viEpQLDi4@DL eDAo 2ZdpX0FaE^Y` E*C DYl9i? 
6C:jϢ;K4YF%Gj3Ko^5 RYU۫ ^`VAc-B@̤I@ |^ [:[PbHu+m0Zi3[FŴ+oWҮMgesMc&A7@`ڢ+Z`8FL-FTa7M;5,FhZ\@Gɪs7zSrx4G _6l[D% XR<9P.oyDh&? Vr)p A(P`AHP#F' \ X 7nkdXllFV_&+lE]("\x>W+f)ķ(N: ѢaT( ȈOZU3:27hâHYa=+NFp `Niu)"725H 5j*\#|rHe<(0SLGjJҸh ҵ5Ώ;jyѦWg'Mg TfD{p$k 6PX?ǰpHJfP z̀zQreߟF`J$nÈ9>X2QV7jgC1IvG.u/cdiVM6O—H"r é K]q9D LBFUSp] Sq1yC^B?Z8/wQ@ cEQ\6JvJT ̮V ,~9={r,U]`}F$?@~_f_x&uGGwwPUHm~JJ HFS\ßZ+`厜@WsR@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)>]%OI 5(`dOF kǯjH I*ڒ@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H *<JJ {*d@ֈ^ +wk@HH DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@t@` =%%P5(<À=%WJH *PDJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%)H DJ R@"%G t}i6֥V09z9kKM[ewqzATJ^Orĵ~krz쓬Lam]3IlY>Ƭ_g9 YyzԾUn\*Oëd'kuI?uf$57jLӲi"is[-p휎 {20Gi1HD{i]_I`Џqko(_fe+|9uX/S~\2_5m;ةMT46 FF#Gَn:pU;<0EiL(?n l5O1bTdV.9'|}[1 i}svy lm :Xgk-Z 7}2@[2N pJ"HL[$dN萶 OݩajG715ƩYɤoVSD6we D6wJQv\tKB_\QY?cWCW߼-r98~9;k3/=?<+>ٛեߤh˼ʱ3I$JNǢ g/g^lR(Ơ_βfzѦߎ_/;a#W1pܗ6ijiϿ~v?<(sOe6Nxs _n .?OO/n)?J½Niy7M( N>~3v=V݉ǢTm.qՀuNCQ?oף;`y?u>- Zp6]6j9tEA+}NOw_Mξ"|qQO3x* S0+ϳ͛>nY5֞~Pxa_|lrmomOq>riۀ$_L -'g|]&r2R^*nx@ֳGzÐ'--JB ۞^<$C,1)2Xel[6xg۹x>Vja[1r旧y>/|rsq|R8N<;\ˣ5d9Z cNl%*hN7k[Au͕)]Hsu#}V7ǘ A?JsO݅*#bĞU #on]\2ZoCG6.U{kLʍ oɥ7GWs#@m[I9hPoPj.XA1t;f)qTxn~}uͥ{nɕ0tvMgkDP͢|{Ԏyɛpq'p] Z+JdJ򯱢kum+R9]/ieǸȫ>WvϟV&}oBnv?/i;cmVjFJ^$K%).ЁEP1^&nvvFɽt;J/| hthn>󯏞S_}:h_OK5P3.?~}@!v^Jh|τ:)[{[m۳vxwGqR3>ŵpqhC>Hɻ"׃:+ИЕr:Z--N. yR]wE2q_tFѷg/Z܎b蜡30NUG 7~ v@ϗp>^cٞ%2<=Y#Щ]\r@e}C!m[G[};dh.II D;@"d0)Bib<"bbd}->lo6ȹ2f؜m\'a[O5pwY %a}iXaO/g3@ qyN2ueRk׳udFns<;?Yw>^G՞;5,V@y:.`=8yf7nӵן>so?^3Z^t }]msuXGpkow̅9|շv`z~o>km$9__FC;F.` ¢}#=|T gH#)QYnWdWoh5;,>^Ī20 fe bO9j1$jqx^el\«ojÀ7kVw7n*-lJ *#bVlV] P\U&HC^f3YN2rj@;=@6{ٔ,IhЄ,R̢ёQ*stY%D bKwpB-лp~upNj 1!͋ۿK 1^"" {5)RJRv$Yf 'vlIEG&ػ, q1r u5eT3JN9P|= |)X-Dm_fM~v`*! 
H!"\M*C.*WB2SY]Ea!'ݲnx-6O&.8@ZSފHi]a6$oِ ĒrtRRE=Jbu%6Dw9'Гpq× 8[ t f4Vq<51Ưԁfi~#˘Wƾ>*;7XFʓA9=؏$lPbN8X9:\[jI ¶6H+$x|<ٹ{~dg؛>pE,%^0τ!6NhYPΊSj;6FBz9)cor] S2.Yew1+E"HND|^Q=;;GcN+r69&ʹ14Bdsff#jj`' vJxOkK jXjDa&:ʁ^*oL@ ☧+FNKo,0 pqommг:kxݤM~!~#"y:8oY,5m%hb,mT m<"@,Ux =?+" ,,|mz NsBVfXڲ2G%"fj=@N`.H:,dwp0  GF0tKSDԔ+pTI tϏI'.؆T$\ ߕgǝ&. (.8K}9x2hnr5_Mɳv mJzb3+l͡ c?P,k\IA●j-JxGnG?VR `Eajر{)܇A Xkxynl?_k뭟 Yfо*z[zʔ _F;pW4xhڰ{4%5$R=D9O2L! >?oО\kc[@nC?*fǒḇ(5F$UƠȘ2J%NqLuC\}(M=EKϷ?-qϣ5ݯoׯK8hb0)ו #1c h-U1j$!F8F@^>&bL! M)b#ǵyxGAc$LOXy0L6 GB;ݮ8tFuυ:忦qҍQ2[5_A,guq40>~W!5eHibnA YM$&F: Ykhof4qY/sd2.|bTqرJmWt²XSKb!Dbvw;Ȯhj")9z4_o ?r1pJmP0e.ªێ2ֱJ^c[poBpIEWai,6KҲ~c|lm#f{8f?=YE-wbey WIuیM\N.XI*gegſq޳0.֦ػ6p h$ :QGvTwNi4( W9)cԲHo V;g$(bJM1Gmɔf.$&dgS1r^* ˾I唄[cg`Nlr`U4MHS֙Q9459mv(,ØQpKx)0gtoWıT؜1h|*CM=%E@=[zS%{; 6r>W)l[*J#&^[ [>)bFU|ɇ7y6K^R!V{}l c-{7翷15N]|m{B{eŶ#GBZ.ҺRntxENgAl+ {=`PĿe7 jlwBA=8 ?^81 tnAѱ+ƃ7`e%:MJt&!ѾWU]W֊7<~>C'%.O\! &$ W}ɛ QtJFjGAx`ׁ;/?O6W>),w6/'q~qjErg ;ӏ;~@:ޛ͏s1˩hXb( 9nyc`!K(Dᆀ E Tl9'fJOasg1&d&:)N  NH9KfsK{n_a3˟/+2z|rPSVmaNY,[ȗ =e ]VhGCVyNǎ[R [LtTw'&g=Gl0}b9*JtH8JKmT{1߰¬68*?]|/ ".2d,$"f݈PM־!!d<?/JǞ~&S>-~E,ɳd&yVy#B`RfT AIo*8U=B58]{ c%yBbZk)aeJZi22T/;`.?pώg{ےzNn=6N:|J2Q'Jf!6ĪBl@`|2bm0 VLBW& xPġ|>+چHNQVýeHX ~% 0V"z/|P(3 S9_u}w9#D)Z֊u49e%4I\TrTQAv(͟5GQ*G|N *\i $)eZ "CBe͆s7# ^{J&c:ilVJV،92*B) R7EDm=6{ԽM6Al++UTChuͻ=ݡEvc^ >:|$@{нgiztU b07}u 82X`QuʃCk,Q 0#*$dEN6Td6kgm4Jd Nl(Js uHч$@Kw պQq>h$,tPIS :+ *Q|D '*Ӷ9% Ұ3w2f$TSeO-` DƊl8w+j>> K?~[zin`<KqOeUޡCH-y2Z',(Zde3 cʒd%ƶR AT=y*Z*6=9RqAڇUDTkl8w[vX-l&{pmn wչ_Tdyypb" Ao~??LObge4- "hr 1rBfHkcbDkeE[S[]Cu|RBfSKTط&)1ɶh5-v҆y(VtVjw{+V E,F9dHk@E.of>{3_>?|D䅡VL%eZ|,$F^a1& % Z[sVUvB^ݫ{uvٽ:Wg ٽ:Wg=pwvTg^ݫ{uvٽ:}P+:WY^ݫ{uvٽ:W & ٽ:^ݫ{uvٽ:Wg^ݫWuvٽ:Wg^ݫ{uvn;Wg^ݫ[ٽ:Wgv{uvٽ:Wg^O& {6*lj@e'DbY ta+X &t H@IP$!u Q. 
%OE6 %9&р)ج5+ "II!0btRf!щm!3Q v#^_NJZ Sz^Sq׮Z^qqA][5~~r2 \SzyꢸNΩRgHT s9 |*ҳU%yo|BIHcT]si$.w=;.ޱq5KCEQHA!ȟV.ܖN͋5)9K 5*k]Ђl2YcB_5Ύz6էgfɷ*~]i(|NQ/ZSb ˻X֓]Z"+C~d7cC %,m1N"ɖ&'օJJJAYiӡV3ϧO~Zy,^!DGv$S")Vّ%[H`cF6㠨xӧ]CNQSSq"d!!XtQf/ϨkSb׶vYF|X.iwPIqVpʅ4@2Ha$+&c v1z k8|02\<H<˄l8&VeVʫa6J@11dp6 0(G yĞڅ+zۅm^ʢ4JSV%JB}fRve;et(%%YKTbߩd\X[XڤUGYPK.TɀFT QJV{\VI$TT:<'"U}b~1Fm႕l|dex6uՊ^ [1?c^ /]^,6E֑zK؄dGN⻴lRO'OfZy, Y |TE˞}<=+!_OZta#ŷCxH"Q ǻ04y?&GNg˪*&[X5_{45 ,c1a)ǸWDz2kciڮ.NiQ^O.+m$IO3K0XxE{{^<%NKݑuTRW-YGD~8%e/@[1 1LH۬qR$fƅϊn~56c7 "v"Ɍ6T@"U#ҵ-:3{d63$\yjg0 5:q! 2 `u.uHLj8/M=k͖4m[SE5i[f~=KkVnJWCm}h\b|mgT˚'k@naus c0 L/R#rLU>cˆ=5q-NhOɢZ OfYLi8p!l$9I2#,YJ $^8x8={轸h{wDae#XD N@J<0@J2mai`NbNYL{xbT}^_An_ƜP!Ko~Pres]ypN9bR%A҂"4ꛨ W_}+1y^8f<3Ԥk-33D[P- 6U!xbP6+N ئ68b+tEGjd~rVk/^s%c7y~\<Q,΃;CT~,|f5eDEK/7J?z8QEȥ;.%QP% P1!IeR$]Z^G27f?whAVz~8@],Dn2iZp܌&/ vN&DiσU-:pOf-Թ$SE=Bx?M3fA[h[^JHD~^m2z6^d(zL^PҾm?ecR~s ,ԹxNCoc Opb~{0 I,W/w}zV|\~^AErV}6*m9 C( ydzEXfBzo&5Q06Xvfi{_EO+45M&W=2ſxNuo˻h#c~wL @ՑIhUG\[gd9H p;ݯu$56mHGע3hiΆYM8H_o{F]-} ߙMdz6xR_"wu:slJQpOU~ؖڴM[̕˟x .NҥVpb#VUXasf &uD%4{Ee2aL>ke+%vY H(򥖜N)bxЄG3-_*_)2ˌC)MH9H2e(h{fTqC]tG5غcj%EPl]ؑl]ђUFh vy"WSxJG*;#$)EysWMټ>i7*o#SÀPO\HPUS9Iڊ `$2؛0d߮QWSAu ҬC[ :b2_ǼQRή6N4=z`3hz*%07dAc) nx=c܁QcgE;nB5[A*H*$ Sf g ~2TSgWȇ@ {_TQJ#%Zfu^ (W$6`0A2 ǨU1kQ$B>%}t=FeFS\HހRأ{L1Ts[r6(#i16Y !EB6U7F*mQoYzr%iC >j;dz0cWḰP>C qt@؀au3@)Oʆb Xe;qS󮮧Dr XXR7Š]J:C#rڥ >T(ֿ \IAߵ>4|w D~Ӓg$_taoyi [V[rA=퀴^ iQi-O6 3 rTHCKK)FDOAs?瀭4]Yf=GO  A qBZʲr@-HrAN+A :f -ki2(<ШWZ[oܲK;ȧ Y0)CFDn Bz/0"}@:H`F̛yxwNT#^M^*7k(vMقo)Go ~~>=E]Yn7oP3`K6A0ɵ r B($<3cLQ1Tk%i-9-{f3Bû$Afz^=Ul ?@Wz*ߔ}sw 6כp3W|&E5bBQBK&P(LvRd}P Hy' |8G:+)Țbjb wܨ1j 3ꄷso-쓢욱zTWRadʙ ~ϝI}pPlއKao'#oDa `~07j8JHJJSx~qW0KL*75tT h3w:0jYe)dB,jUԾ~!IIHGYt%+%+L.NuIRMO[|:uQ )x\$"Q¬aBFCgDqk59ws]]5Ԫ[Q (^KU"FXUS#k&0k+s\R[i-um̏7͸/Okﯺ;7_ 6 ` Wד?l&!HQIoP(F/pPy~٠rK咗DY$tTdC"R1H Q$% ,Byz(IyiʩG@i=5z?`$>nOU;xӏYWC۬_ښ^)@%z! 
Q*`:,(&ےQDZ"hxQ<]^3hMDH)>2 PaT2B.GA`Փm4XSzI{I z8.xRO А^deD*I!jNxLrX!YitR3a }+w=t*z@Qז8=O .x/ I˯\nàƏW4lֹ8x=@v^~MW51&m^[*74 "UQ2Lw37O_޲X?f^Z7ZܶVnWDXBVbKKz~mTdv]tg\lM> ٴ ;ҽd,2т7ɡ{(liZ9Tas/FbBXSK^ԞjnBfخƋ9@[GX-gmڹk^耾+&lU~=-ןæ6eZ""X!wjxlU&PӺƬѱz̷ quMh_@MzxIsbA3?rR͜2Rb1Z2&ps6a\}1ވy\lvL#G#CڃLkksJ/>. 'i=^-NY0{wfֆn@ ϝ?M `V u>G(;=R__|PGd\EmO'O(DBe(j!d!JeҚm)9Pmʠ:Qz\7C3^XvgqDZ-vl&=Fzh>ƁrgǸɺg(XICI%j%ȋV5VٳB/`l~/r-* 8 ^CJ <6җ*65D'%Z~DKݷVL'}0aʡJ"KpkcRS)2S6:A6 Y1֮%MId -2)05l MN(:T1팜C['ޜo6Pmg>w_?m˴ԼxH{=FpA ZUӐ]3/W^)5NozEm-hVM`&LȞ7Uw=UItd'+YPb-RT.+)kiqK0!W;T_{m{X*e2+>eA -!S魱(kk[X?kE*G2G.\JZ킑cFu_0C"7m*qv`~!Y}pF …\W:IU&267ABlD #Ig}ؘ>/M_ DZ38x+N2"M`tUKkR`#/Ř䢉H+Q$g/TcNpɡi- 5$ |b5( tuyVɜNw|| S'o:fy7Ϸ\?yGyiT}}_+˘_zћ|AP>%hT*'' I\dQ*Pr6f EzR%"Ar;U2L9Z Pb`.263X5z'Ii/좣nFQ"8!Iz풲NUP ,w 59om"1@r9Y+W}bQka1Č4D$4FTla~O"8G ]׊=ZUDɩB2Y|$lBJKIyUKjup9N oөsP'I hPD-^%0kz hSR2a2(nGҨ:㉓Ҩz??jς#<"T,dUPhaD lT{-Ff<"T+`QSy}XT3 &fсiCѤPUxv2IaQ'aCY ѝxlX}8TKzb\=|i!ۊHO^ JiD E$ &_(YiɳT;@M.6Q8Y$:1CJgOI|(7*l % 2EJǗBB" ŔOw[\؄`2H$Ӑ|@> ]%uר+rbj>X7{=س-wo9Q¢S鋃Z'[3Ph#Γ|  s ⣘_6ޱ̓q9@Q9K.S9FW:ޱA,*NSzAܱ2TM i{[AXP$9>@6ѺK{*d]!Sml5a)RGALv"eh !J)!:!̑"BTJ|AVNaNX^C!e8fI5 0h_>?é^?}`Úz7k۳/9_ntQMJ_Odq_F@Ǜ> y5.?U ٘%~H4S),o(NT)fzRiQǦjcSK~ k˦/xBaƀ>1ݎ'yQN]h/m4Bsb=m-DT&"p94.kw #Ս`4og۪^oobAŒϔohw줨YPkfQF$M2#ɵ,L[rG߮mBe^mesS2[>7v~o?./ddQQknvݩlˡ+?xة{Խ^ PMvGGYq~3yg}1p-S騹ޔxhr뎂'pFx/ͷO^'fl6YBF ":>J#& DFyNetK@E{#xL%x;(y)FkH (4#pt"j0S;q*3J"Nq{䏴n̿̍y,tߘo6es_1>zƘeE#4!zh6ښD>6 Q 9Cǂ-f`vO?Һѧ"~cbY[0ؼgȉu((.S8d7m| ) " Z)lm&I(%|6Nꌜ@SH=*%: |_Rp]m!%_Kg%lY~pQDG= }ywMABe SM jH\cy*vcYuHПɓh@QcZ""rXvQ)=$ۜMWKn5EyᓗhRƘRZTE\P%ᘭU:Fŵ: Q3 " QHA` WK܎θ֏ w!.qOko-),-|w{S~ftGO+N8([ޫ'2WHiw|ً R;FneN|4rW,|œ{xI~wNԼ[Oה-m+;pK!:_nخ5 dxʿvJGvƮj]JkHHns2̾L׫՛o9m5ShlZ i;_J\0L6oٺ9R~̻< [n<_gߣѨ7e܉rKw X_EwB.ɿ@ޅu~Ӷ)mmlrېzVwu#KD!􏴗]!+JI}~SYCXv{Lѧq!~Zy}?N羚v<5 Z(G1ym!a$Dd:]X :!BIk]^|Z }3t|\6W.RoCEgy(~5Isûhӟfsǟ}{g_{_K>Fmm 2fX Z~Mz2f~:kYۻ&%މL8-i{ ?@Wz(RT]>e(5 0]hthD0jhJ!e0ݞj}wTi~ J[ݦo7Cqhbu\ [K]ۏ4Nmڶ(@-g09eLͪ}F&=r%9rq>p ڿW|2rg&@;P0zLaȑ0=>'5iĚwoQ~LJ*C|IQ!!JMEޮ ݧ>Ḏ!`o o]#vo tbzpڕ=irސC9}f"$h Cf];#*;=:.K 
QZMYBtNG"ӂ$y xBl= \DI9钵E h| nԘSU#͚#㛮qʦv䓛q,8c:g6sVwDJK0+|7} G< ]ʂvP&ʻ@ooAwb4b-/1xԩ+(v#z}! Ev6 Dgm%v]я&޵uHWM$n!uǶίb _Jx0"8@W>F'gY.x:|E%?;~v׋9IlFtJe#ZT  U)"y;|eNԽmV,WlCYW34psvqwh`7X_bĴV3SEp_tq|39]\5@B QO\|xQ@!JҨPouʃCu>/Q''V$)"' 2D{IT2~?XW"cuڜ}FX8t9B>'TElP%x`lPS7ΎM!y#GTǃ'BHҕ h$\pdH壶B*Hm(|FaiXY~e}WQƌ}"fjA`XQugQ̰s8U4gU;'`m`rѶNOfK^4cEijwEb=O R@h y.{6ǞQ`LTd%VI1dKJ%A"@Tlzr.j *Z`Q/F_Hͺ2*la3x-|C>wx;»ձ]xMڵycŶvA|8]q) ̭V v`I30SF JQ)Ĉ- 4T瓪5Qj6h0pwrv\Ql2ЙsR뭎VndPCͤP`7IbR$`A7Bp1"X*rZ z #bAjj+1l5#C̢#d˶&eʊa6A=Mܛug=I2j5bs-"40X"nx>z,EH*I.215H! ȵv:B̍Ff5T;ufVMi!S̤ZVbT!.ج;;d>YPCu6E.`ѣ%K:d6,IRbm *H@ !J-C>;[IsilLXw{ng_#zb3;aDNۃ׉~ݧ |!x\ѕgyQ O>g?,2eL>i=X%&7^;??`-x=ƨ:"!ē\5Ewu1X.F <ǿL&W__{?7\='Ovɿ*CT?k-WJ͝j YzH T36%@{7E^2E]}kHQCh>HQ2ր.K,̗+JWO 6@tFŵuj+]`:&1$L\”YKaLT i48GJj+~gmŵ,.!7[~+w:B5ݬJTw\LCl ?@Wz42V, Cy-ˮ5-עfUСPw;)B}wrU@HnaWIZn5~y9]DA{a; gBZB:π:FTTbvu+.&,P4J8UT|C'zkK8ImEY2VUlQӧz&H%n3 LW7f~1{+ O_t0eAHl@1Nfg"j`ʠoy2G؉rS~,h]YpYt K>D(ML! cFˬ|Uo+OtHuyZ:p|,gWMGemvWۡ)415zqQ![f\~270$8ke6Jrb܊aJS9 H1{A$~*qΠ8G< QˤF 4L0/W dztܾbD/:6Ų\vb,ɲHs G`P2͘5tDF97c5`P~%}3(9Ǡ<$Qkt8rUjs@)6d\4.Gl~tFECLAqR!8rYJf.s>V`lիsASe#뵾(}Sg\ZNs+cQ3Åf+2iW Oڍu9! . gO < $$سSʂZxm?k}W'G6D%5E}/ڪHΓ26LpI{B9`L2QdX7@C`0}xʒ R=DmSvAX U9gRR5c5rkzX^ӼZїv|Ѕ'Յwz. 
(Ћ0`EED!J buPh`.Y5TidFk%M'T ZF)]U)G_CH F1?(4(KAbP0T =n*Q{_ qo*vJ,1gO(AeJ1EnXmntt[۪ytJ ; ؝JKm^bG)ь=_ߥb\j\㡗B@ 9U}}w䙨673q9U:fo,R5}1Cu,mN(^˓t/QG(ObRe:EΣ%rz|,G팾xw:6P{ƀߡ>z{q.q'M_ozr3w0_ߎF(ʧG)L5 k昽E¦nIF"-\~)hac=`\܀ZB,SK~o{ѱ)P٥℩f~ȩ1*&$e-~K! (;3J>Ua!++dԆIêABSDȠ*sXZ V#q"5ET[tnMpH{eoH6O;zugs)>#}EYjkDb%"zx 2.5 1T UM 4p4,wd "9=*@c^қ]]sڐsڨpY>FoTYIQIHz,0$Y8cX)2&J~whq>"O\'QEIE4B+ 8YyzאT\eWj;PRCsJɃvڙõgZAU~MQZR.$ʈfhБ-)t?^b.2ٝ^(+_|B+/U3[; qV#15 @O.B"/1~*Sڣ1~Eg4oqԖ`(CtN>v|#]0)yz I8A`cɿƗ' Pj{nrFjƋלkQYÿ N+wo#?{1Ē3Wc9PoyCѮ@pEm_}o_=,1s3px^n>2olG8r\J%wMΚ|+ٖG=9ٖf#_ PY[ɷyqkԟ !DDzM,0uGvc\[$C.\x~=*Gbp)xRyg`0HvǓc[?C7JhI?(f'kU#MASŸuZ p&i 'Vx԰~b fCm& =ӄkVVFHZkAxM2XԺ֬>X (r=3@^ɷ;_&EEoENVgW09ܩ;l?]|Y/6[pq`sܔGI˾n@#oGމZZS)42>)͑lឆSUwDS72`Tj$nYVghz'HH3\dVKw;ȸy:5|t(Ou b'⥻/kmfbb 8+}3-wo "5{­,#@zٍ-V6`{:ؽuj D 0`xNFv3suÌ _7`6jϜlל=7v%>YKjA3:PQK C(yؙ!pJIԕq!>Ao#= ~@1uxPn1%CF$ q*S~8@揷ˢW%%|J0 AU0t5DeU2u-jKk7O;odjR3BP̱0֝^ѫ?Ɇ8j)]춮dImwFݾ4Yr dj&* .HYB"-aNm;{%MgZhb̖Zw,[ PVů"]\ٱVb;~F)c.z,`ܹ' 9~ls?;)8ov2.} VN9,=*J ^9spB4h8 _SVsfPqMhspW6{$O89-oUYٙS YO,W^Z ΐ !;=r|ئun({wATgǤ LfN5 zng*P7Ϥr} %`Z?mʰaO='֌1cӷq~Z!I CD:ԭ\l2%abyP9hbyh>t ].[M@z# 2?1[-(v؅x[ɉP Q !3.zkJ\#GK+y; , }\^O]{s6*m37LgM&I۹d2|\KJR~7$%Q/ IYq؝E"q888t :o{Oߒߚߨ A A1VV7Ỷ7M*6G;.gaY}X+DžlhdZ<( P9F!Sei Z *,縠2Y}y8Wxn>ȑcqp5y=@Ϣ-PuAAQpCӰ )xVNg)DN 3X}" , Hi}~D *lHhI@Z'C J!Wbo0k)P*dd6naC=ETʍhlu ƒT%- Uǿ2$b 9PS_6iCzNT5lxQnպ Ѯ*Ѝ1s{Q?+gIz–#L˳aR bqZԑ2E>6'ȮBΓ7o5;s%*xWfQ+_623j:)Uy2ƙ~2W P`lC@9  !GGh&`*۰ﵒYAaۼJ/^\GMSͳu)m^Nv0@[G BsW8: R\T_FixlHl/NRU^; )pؑN8yHě=+}/GgĒ(Hg= Xa4R ZQ?M4/F?`XqZ`3M<*2y'jJ;.2O^VSg{]~pKzk)a" 6VAk/sz֯1Un$d'R"䵴CdN ⾂w'gU<0)ՀR8qgўg;ky:E[]۷Z~0@]%?9<,xR^ynFe0޿~UYTխMU;:kw8~UƌCgut.3Άnر_e+}ZN%)»'I&3Y4 bדD$rȯ?~ ?K C:,ހXp9 /ë8(*-|vz[GL={dAp3~ߗR[\y<_V"c&ΗbK&C_H1@0fHeH sh8;z%I|(:/FSL.U6[d+,PZK%TF1/% 2v6I' &y/9!j+KlOC&qthUSY>!| 7@mpi o vC pB&41#4#]ƌknARV-ClfKe*gB*W%Z[7ʱΊ^:;KDq&H *-A," Q;h1K 9gmw`\ _;j!b9ܡ՘`*S?q]U}5۟J :L=ޤ  peDi6a,:X~jOUn_]q7쮑5(p3T+2BS'Qx`8aO5Wk?G/d,r憱3 %Zaܣf-oDYu1,+XA<Ѭ (Y(z|]Q"LDpr/"$wT&8;K}<ovҫi9xGe0 9וUz_w_Nq־I'{›N坕Bǚ#A+};Tׇ){ﳀA ڍ9pz"rnZB 92y"#AT< N&96 
Ҿە[qZh-)2k {H;܀cd+x>܍snDCt WݨQ>Z}=`Rd T?p,6ُCT)GgEeNʿU6}heQ>8Ae®@ &HJW<.s֜Z5B@ EţY[t^1qsiF%5{kw,<80!0&u7@m @L` ~c.STu!mz)Alöf#M=91,3)IJs;?l9fiR}4MPvV^9`&t+Rf@Y[f_[(EmR ϶%c LoO?k>~Zޥ]V-XX=UYt?7vνQǡ!yg!_<X?pT,feUer/&jдÍ"ZneFBeeGzpc horf.tOlKS8Ǡؠ:M4gZh:i<8Q]ۺnsiC PoHKGYs0Ƥ!iۆOYrf5=fӻ9{RU('u:&ۢ75ñzNT,e_k$Qv Sv)}(8UBQ:B91sșah[jptqLpLl!ôM<CBeXFVI06/'%26@/IBQDЛx/bZ?ݶp۫\:@Zm!H brnjj <<,'A+7 ? / =T>J6Z*, K-e5FHVm& ZA]p ^Y=k d)s[axg% I'&gEe~c JQqRo}yU^B׫:OFdJ)feJnt^*96 A'Ovh-J .JJI}gņLKA95(xc FDu^ua5| ~(^=8<,xR_ynFe0޿~UYrZd0JјqH5)7LX8û.bf~iop:IosRhҫ?P b$N8R~l29FaTlڠ5ږs1 TL?c@ ]ecՏű3jP!) -ytv/u|/m< Tgb5iC&c¬/(՚&ג*JN~הr6MGz ( x![ʝ/5SNuHBHj/rGky#)tmb≂ҝ3*IbNF^'&ir}%r,^'dK=k\b ;3,ߑ$#4VS8TFv>VbXؘ f%S:7+o?>Lؘ1`ȲYSD N\ RC(d@VqJ?&dtM,3T?yGbw:Ψx $A@>Z ~.;(E}~ׇrP=vՈNa˹)r!;2ǥD*bCb@b>ij[J"qqk-ݜb*_#KDáҊ<fnl]p_Aƴ\u cԃ"A.7\R\n>NՆƛˋ7? (8{pM:@*qw? d'1, KR1g'F2g襏$Z<>Jl>}-hB'5_CxZnlR7ܿ⛂^5RmԨq(ܪ/Bj!PI_^W@b&-i+Ps73(=(ulhۧ͘kXM`AȚ(\T#<\; gDb.͘%iQ')m*ڣ#:N&>i*ڣbQŚ@ 5F[*0fZ@2,9IeT *J- 2Ul% .3^V!z$=1.?9PhG凎ue˘A{1ewP,L6NVgl1JFk9h$"'Ak'[#X9ji=;frZ#5Z#ߑcoF<mpRaig `[F,[ki]2< ]RgT"@%H_OնdZdEy}'vEZdEn"תg߰-*$Gc@970bpBԻIGN!VcTIdYU6"[-r-:dZdEZd /rm/F3jPX6a# cc)hIM/t"5*6L30}(DςilMyq;t%])K`ByB&3@5b4#`E=ܣ$3m=VtC `m3J`BͭrXPfHP4icSv"&sds( o)"}P"7tw"_PdbiyKÞFr/ y99ErN~Q9'sNp 4]]YoI+^f=E}6vK/<KodE/)UEj6A&2+###C!TG-WDH2$F>Oj/]VYt~گkoAՓ`|1M BfC2/Y$o/?XLwx]rF^RuSj"ƛJqr>{bF#ªSbZ Xob2 G.{gNf  tWݲǪ@[]Z 5Snq DHD:\Lr%bYI DX8!*"aL $2;o#˦ @MBMZlB;p@,DLc:9sNGD ZsdG uri]+%|w >b3X1):-X ӑEs Jha;,r 9 i @N}lRw_u 2l CT$bI tP2(G5;/ Ƃ.b|V8ƙm "*9a[J L ci: `f4 D)V jJyєbe`( )vJ8]3Ns1 weD^;I`A7eہDQK 8<.ʿB_7f$7%0>MpknZHH*7W?*XӋBM#a~R\gv5% &LE>Vw4ږůpiAv ֯t| R׎=A-spX7|9͋:NhC\ob_rɾo滲Z޲j .1zyI|F'qu6,)>oZsUW[޽MRq('ڈgwVLH 5qgryt&Ozݳҵ+Ex*5'+R6 K?iõw<+뻷i*Zmi#'@=X4Px(" "xQjID\{ц6Bճ/{G3ս7?=Tw-j3kYuwa#K:M*ٽ BoxmL6$ℷA)\*5ݱ Jpm^Cr Ƃ#Mt{4кv2.kt~|R]U'>W~H}L?PITgJњs(,[.0h.e]iv7`NR<=8e9A^øucn0{IZsPG6tsOZ^S;&5$3}522_ ހ3^6+E43R!7W^/5W!⯃?oz#[1IG$)X{\1)/:xK{6ef yrDdnqՈKV[zk{6 [{C:wO!vP閣ZP`|[bYQ6LIR2FĚLrTX:2GtyQ9*S-$P9C?%xyp#@3qO#z>F7a3AjGͳzUXuNe}sd~ ;g!,% 
j< ^0fkF=NY$ Yt{,Wg vo+gB O&hxb֎.o~{ bD%1$k)ݦz;'2J S}G)5iO҉TQg"<JrN 6r~Wy÷KAkG2׍Ac 织9?y¬cŭ̇xSc _r,"5Wu$ zJ7$WGd*9 =8eȩ}Ct,,|k6,|P*w\ѝQ؈gN*b4n7^ΏW(,/뼰ܬ+o-Ʊћ&LcҦ][θ< &|!jɆ1 tud-;kϧTwO)8JWG:"8 ltDxX>8Z觨>(u^+}Wp~Z[яj7||x m^sOq 9[>TVU":1L'8(akEzd_,_"UXdx|&2x9XKA:jc+tº[矶?df̞`DikqpI\lZ{|*v0cTYTLcFaxζdDԖwk"N`qrI[CZoE6 n\L#O^4QAu4Ҡ!RSm TmuKm*e~1`Ҥ|a5kusq6x(Q i#Z)!V!$kU+'8rn^DOp,/J^[6Z qRɓ-m@BJ\-4u{ۺЄt$UC ZHk`U bGb @ͰJ<sloxJ k,JWgͲNt퐦W bw<NRu岸2M(ab:YSn;AgkhO9Ι%)?RcmQ]Fu8;F[~=@3[6٢pd>CLҏs׬]g{7G ';0Zv:]1A.:T&Q*E:+y.2 "u&:mZ+tGX *rj*ZL`.W?9shB 8`sw;o| {1s,]<^Jd1oWm(V%toK\շn]+!50μBX˔sj6/aW€s䵵4K0~<䗉wEY+,)Ü! @?W!Iq&x@uWkaN\ެwTQ(`);b(tTpX9̓h˰K7 t|DD~RQǥ3oJP5C9BlD6PnYk ;us:`l<**0 xdybȐ6'5Bڜc,zllqcL \`/9K[Q J@Zd5k1AHݺ3qit~X=\7mDCԼ0p)Dŀh.3InH!'\A PCHFGƌ*m1a6N9N3(FH(>4 rĕPs`\X僱38{g7{`LEGGUOv6p t*$QG ,h2 GaIrHv8C c% $CryZHAVY}ZZK ]0Leܳpma}H}?݅,[Uf9zU+,eosi9y"R%݀Uu^]+P__^ h:/mՎE7@=f3Ybbq2#hfrX #?"2c\m7psnyi9/fV$’~uf{7O[Nen.u(ׄR2hYA +Tia߬$~_ sqHHn+L4tm$#F ~|$IC{g q6hw^iNyok;AJ4ft1j- p[>lۋF*s~%MN|nTi9u/m/93.﨟vzrjk>2yI*j2'2oXdD_C()O9Kc[(o0pO[:8W"HMUweL}2&ו06"֕0v-cF2&U\3Sgw\I3k<|̀8:9R PF%X f+R3 ZũDXqo f(r$ /*AaXHJ4yX "͂t>)t%A0GB,^mYpz,ѽY`  4oV!8<<8F qBQ 0l/xH*y|f ZgH' >)%Nηu 7XwV_wNhא3_|[,̓>49KqT ;+gOeWsj5;]3OnTQ ~+FMS1Y&8.MuV.rr&LŨ[-3YuͶ^SdLY!R bd^"UY@yq=0Ļ䠺;pYD`^s@idgh6u$lh^/ O?M: MBz'wOtnt>mm8M"Eg+ZI_R眾؛h@fu>Ii>}>~ثЯ5^żyBy%ȣ۹]$h2ikluL !uwdoI:Y=9ǻL3Fu {{:+3-E|-c񰉶>>(=~Z^Oj߶t_}G}ա齃[T\sa纽^o\u4P'Zh#/j?һS: l֯ҏ KJ9&|$tc1W|:k~KT .x* *ZLx}T8+i8x.-)ãuAQR:NoG#Co_MT(u;&*LEX"fqЎocȝ|7$*D(}ZСE[DGg ?*(]4y b4E'XZgyJ!,džAJg9 O#0q-"ؖC`ͦס&5$`׉QMe}T}w 7GKO伧~t\۲x㿭좚ª7&}4M系v<Uۛ%~='X8)uM: /Ƕ$ r(%ℚQƨĠQ тh.չմiVr.qvOJ±*DCTz\<=NjIEG7((- 'X?{ AJ@uM|չ1E뱠S Ɲ1<bL\(w]wT Ř^l)mFnQa{䝜Dܴ|Zi)@cD5'۷ױDu΄Yױ^݁$&I>{Wa XB8>hu}ijmRC_~)],&_L#Na< zׇl>쪫;踔,GHA[C)O:5 jkڡ[+9/(6t 鋪jX&0E+R!6 Cm\wOL)h*4uDx]QKA:ކx" sxRA︄ŭr'\%*^~]{nv>CKŹ1W3l eX4+1RY"2#[9I1 2O?Gv_5]ڪGserPŻL co_}) ;y?#um ?~ŸFE5'S@9p[]XgzrzT f}#-94}Rtclˍaa6" 3KE~9Y<ލ@0J<au91BL=y "lSu6F[S" $p-gjѽY68P,G!@:֋$FόQ|{\L#5e>,g7FJD oP4C 3JT`;@qqϝɆΪ;wNޙl:Ҡݓ  
4JW~caIE^Z+OCYQ䔤(%FaEP'eN^]jks\Itj#Ta4.ga0d4;8ۖK ?]p"'1Gs6eu^&ezYnzEⴗ#K[/b @q.,'{}g Q F|0f֝ON ps{H3!;h=Un5T1% abgӉ*ys=-#_t!'͋HUNůJvcf`{ԦOwy#;"_Χӥ?C%*Ǘ":n^YM#V!w9e "4aԀfL&8\)ԁspj 4q4ڭⶲW [T'N[+OZs62FYIbZy'Ǝ+&MeL[~fa'OqL Nn)D?vܼtv|@pnQyfs͖ӊ,2q Gz04y\ 7^IkzOP( =p]q_wN׺݋${76^i6OtOզK `sOwnbhۅ,<9y;JILkmpk!""ㆩB)pMV9C VJqJ\7_n@B xnHQˮ:ohIE0 );->@Mhzo F!+Ͱִ׬fz㴨&<{}u %ސ\,FTp$SQqPN7DAk➹)С IR ~Gxt(Af- Avawr·w?kW(G7,o wqE8 %JsVDӢ:,ޤ l4[)Ic6~ݮɕ]^x-өxDrs#bN1Q[MV(cT`VhAsTsiӬ\lڱzL( p:3߯o~.=-V};6 {s[L<<=NjzyhY@41eR$"c" ZhSx2%sS`$&SF)sNS- J5()C%I`80Mq|̤*PD/"/R(MҔtZ$#By`\&tFӉF)q$4Ο i$*L&C`l bE#ZmabE J (9z\#B} BɧUT_hʔTF|e)PHjDy&l$5VIqxqB,᷏{>qV}"bum >W6_~z駟 ~ E Ŭl-c%ܽ ՂpsEȟ[dzwӛеo3p;$sa^ ndx4}8= !gCAq).xL#6&g-E6kы00n>}|k7I!")Rm.ziA"' a[=lcV`ȐY妍IR.x0H`n!RӴn-x4&lԚ5&Qj(m[S9A5egd3 dT\u9f&BYDO3:cP.|UEk+臘?cg6Q rP"}Fy9dĐ;0j-VhK;6 }+k.SuX Id'd%1=peg}WgQBY\pLI3o"yD\ ~xݜ ntpĜ黋ӻ%y]fȆ\! jGf0 wՇʄ)LA>IFmm]^sYKDPCN5'zCHOa&+\&/j~zXa\3A5@9oj3Ԓ炭U+6V(RVy.;ZKVݘ-[ι+q8M/sJR8Orq%~v~pXүCoǯѫbiG͖ԨQ oFtB$Nz(Βb>)zٲM񾰾&ęmLX k/X={qA j]QuE4tYz/1mr@IA742)J^%Ӑ O]Qy PWY,G:.Q^WY$,FWbC"Eފk&Nx8yN AGo$%#,)L'p"iGiA0;q՛5I;ѣRJx$#1J1"Ho)Ll.Ze\A:7W;*@Yvk -^Վ@gvuV@rMqJ`ʵ%`5kch9j4S5< SlٷVywAm9i'K!~>PX+dfYN)%t`YT0:FL+k& 9#EזgsT^!UjiTH>_GF^v`z≘(-Y-$3ؐ#Q6M\D k1ɀ,^ղWcyoWǤHe~oD2:pCGMƍ!HImu_$+$0k^(>֚5+(HbJ>2LMu{GJrMrbU1KA&RdOpa)-G(^QP=y7ÀVDO16G V.^= iZ: F1h-z0&/u8)Բm斯 Jm)qh0IkGoH6PXMףY̝eiΛ_fՂp< ipqU $_GOOZ"a8l]6BBWեyù! 
o<Լ|v*m;ؐs.ƗU_nIn+r˩3]w6 ڛ#5^>oU'{0.MXڮhq$y1 5 k-`S  Q`,yV6Pq l=H 䒷i1 T*YL%=C(w\up R\q- KBI3,P8 /!UH6{JW{jSXMȖl{ԇhtFH&e2gI3Pژ1b8V^3GI Vʓm>7LIBGK / |BH` {{&ޖS0o>J@Av<fETΑVkL`Rq`B %Ƙ$CP 'l|6Sm&=|\nR7%7i*ɒtgkT2KSP~LO&4jց2P=7o(iP%U :թAڋ 4Ie XHX:C^^l ;SkyBENzIRPu= jS(󒢺nNŊ}{[ ,Mg *L9z''Gs99rٍ0oTj~a0=f\0jujx 8J\*Fo%gR#t?ʿ:_\|qIJQNλ?NU^Ss62,x& Txڥ^#%@<>"F6|!,BC}oTqt#JþU!EVBk}7X-F:]Q|W‡F|jÅ>Aq @αp@m_8΁@X9[qq4mKWNya!qQ܂2&ZvȫLOEfKjjUA6/=܉B#]yH.:DPuN9ZjOؠ-jQVkhzl ]Ч6V0=pQ;眢3,$Ӟ4ED':iqLJʸKn|20o`7u/| xvٞFF f^DkqV{@ǷEZ ytH(&%Z-EY "d3(R1RVa}PmY`=X\3.FLuTg69)Tߞ:MO-1!DE А~>o-#K?_>]A=3-rA4 ECg/zճ)?L.l{9f)}2ja$9A{1[zIBAp.M'vHE?o܋< FW[+@PV^v"u_W ֚[%K Oսo!%րG${[מȯ`8"[D~$YnȰ64(+nBhL[BƂB;2Z5Jdڇ.*+mO* N ȸ<7V rđPB"V/8轪%?g7bܮ3+;q!9c:!/: !.JCY}LSO5Ce O0!מut`>)u\qO? 9^J?+5ɩy8S̹wqvEXr6k)VHXa=Fe3}jn̡N=LV477;4sKRx-ͬBJ:+HYr  bO&I2V-O:4yl[0.(C\L$ޤz4]pMROKDx,wi3[yE2$Yτծ79T^uk 6xewvanMs<]8(,V(YDʂtj8(6"S&w7^w[^ ;Hrk k{߰VaJ8_Au>PQ6U%^d{Ss2x,֪ I q A"Ap0{ZӦC(y!O*+_q:yVjr䤙l =={Q+w=Xs0~kBuuicu| CbPYw2־FSX6%*d&}i-'/mcIʟ2ljPZ=oVV+O&NMp:Zpb}j+"]cvl,E0 rv-ʙ얈"=XX1gs^Խ v~r!FٱZ2hgHBj`rIof*²wyC]crjۊRѝ1v{^H+Ν2{T;eozxe߹Sfsd&^zwzmr/~\ȫ#QoS3:SjvgWpxYu[{7W72Q5I]`͇Bfe-emf3uV1Q=k\^Ū_TPƽX -g!(~Tpqly΃Nm$4E+ؙB%(s3H&m :cHhVz$)Tvrjc~mp=5z_? #gINJX|e{KV쳮ې纄wkA ΌQu*Ip*$zaJSux Wqzx+Ս1N~u_A gd ym|ν^THO/T_ ~קK)doRNt:Wm! [V*2=G24_pv'~§z}3tKۮzڄ0*k8 d"{e}16sLJĴ< " Ndc hJuJe$ťL*jʢ 7'a [ MŅAY/..`gXŠ04OW.6L Wl!Vڻ) "Bom^4d@ /W 7Wc)T 2'WK 8pM7n.dd~-޾UU@uˤ܆fKPlZ9)UY)7JU-8%!aJ' $=zRĠWD;rz޿^wt| /A(mmC4ݬ";nw9Nͪ_ 2&~2Ք]kCh<>.?sJ.UZQZ#Vbi<2Q7CGF-HXjz0*/dTv CJJ0z~*7\IOm 7qWMJd|>s-k/}:yi 0B.[(\NPtqf㮆jXGFt$YE-$sj""x}yV 娠*ncC3%ΎlZНyeP_mF@# ddxH5#B$xȩj1[)Liw FI2aܯoyy0J8M{THDy҈2jyD$F$G#"9"a1C;;7ra{rN5\1SՃ) ?Z[-63r௚7+SpU,mҫ(MMǷoߞԿWaHG6\_}uu~-߾JWGpySU_ZvW_.M  'WmvƷ3oyy 3bvq 5^Y^{>KtFӘ? 
ק9ukbz?=mrLlIUG'id*o'깢Aɬ*Po1G(d- SmS* $粤"Hݜ>?\$1Ib7Q)ƃ֊@-1#KjԦ@vԴ1jQx˜Nh ĖH&{ %x$gQ.# *-#FiT\84L\ND/D FZb%s@4Jb""F6s|b OA%E+2GB$(ZhR9A9+(2<"HR"%TPˢ1i`;$5DAoAf.uDR2C7ԦD2(O:S#u;˝^iT;Vo-mRE/惤@ޑ-cx E Y`h0j!aqHj]B#"b|3 cLNEJiQ[L/Sˮnr2D,  fXTrˤ "j9b*K O0ѵ q)$PW)9 J"r1Zm`5ni=6ڸ@Y栨FGάwRAYvA 4JCo+ H3PMB蠰YKI{HB F(<+[[D tN2jM2X*3JYYG]'1 C`Gk$@r Z.E< &y "A,ӬA}omh%@bCF<D%e(] sk `4Q.KN9K !vhK բXA$Aq^]Du`X#Z`Kb9H a" 6xNSU[q)QLXuj`>E o8NU,,8Ӑ>:6l Ӫۈ!V@SoPP0X+` I NmVuԂ~I%eLH,5BPKFLVgY{Rv)†bЀ)J)"r SĨeBh#Sݢ >oV#CD{ d4xE 2ǒ ,\a2n-]0( |*GBSSF 4C`Br6E,a@1 D*{C*Mx#{ NJV4}FB%+RFÄȘRX ĶL!5"pAir2, {PEF& d$JF5H-q l+"Ʀ@(?+D 7[r&vR i'VV0- f5k'MO@,/&0 C+ rVXhUed^Hz`x -ջ=ٿf\·AA<ĴF9uk`/ @4\郐=d%婚%D /*K?"n a\s%( >B<0H P ܯIZ2k(&Ci\;.Dkvzf)s.z%ĖvB#G.*4 )Ü(E^ `aV:LAwV# ͼH'ZHpF`J"98q-Ē[Ptˆ (`ۉ- Դ@ B)l+E#E  ٘x&R”Hgۉ O`gsdxހ!#2{WƑ ͭH ^"eMF~=,z{,”c4ECΈl'%U]m}4F7= `jWX}TUd*X K/m SBF,H)d8.z0F[WV;g-~V* CHx2^z m +[ŌHH(cx eA* t=**, j$ft[2` kL` %0 DVaX:-#`ԁF =+) \mE' \x 2TR s+ 8AgD׻i)Am`JE2a yuV=pK19yipIg2 kàSjȁ+7!_L#/T&^\Z`RcK@7YJƝfPk— ہKYHp" )5/WVq ?:7lǿV$2].cpP*s~CC0F'}8w\8¬bN[H4o!d&p#k۸;NŢ;b% YX٧d5fn(v?ֱ:M07eKߗcTѼPekt$#NE[E'JZ}k07B)[ل'GKF{2AeG (L+mr #F r DWv'8:%3X5XaK =됆U:Y\?PLqM>͑V3U'<Tp{s!J?}>w-'8f; V,߼E|}{-yHƤ_w5<-~< ,#=BIuk$ +wk$zbҰ2wk" ^\ %.Tq mSGtn)ajCk$V0Q.f{d+6' Y=!mn&Cs34B5Id Ղ< P(w98n<#8dXx#P`Ƒ*{Xx'˸nsiB0p2io֪fh[WO>,o`i?K?J1.DoDF\MQ$F:O//>̦bX ss,<^o0~3 ?mnm[\ `q]]s=}J&$:xggI5%W3"Ӱ5 D PԄ[@$cL.LT\ʨ aU| ;7.6n;Va\i/g}Ůe/L ;3 7 i!Vښڜ#v}ZyogW'=M}u?nzZ%AU>}瓸CORm1:u~y^*]/sZVg3l,UWRBt^#zi֮U*4IXZ WI:GZ&JW7W2CX9 |Gy2ժ}T!7u.ޗ'o7=|p3'I|7Y,矆+^縸.C1ҫy<̮W˗}a6o/e|.98KB$=6YM.tc|bGY1ԪB^-'*H ff $ͮ.hS\ͥ7 f!1O6Z.F84Mdd%2Fv%~9p<׹0KFJes 92i&9L^fZbJzYĀVCJNNʃ#kuf't>eff֯٤;)Ik{&en6Ln6ZI{O|}B<5l,#sK76G 6&5)$!C{4FjSSI#F #B17ٻU^b8.|p -^/j^wt>p-myߌo"XT[=2Zw-a#).%R.y)C!t+ h NDIS$UvZ~; %=``w=UH~+uL!v&?W'W$E|W0'x[:<|~9d5A!M?هx=Q[Z rGV Zh$`ecy0"jcE!w{Q8Y0S5{uIkyIeGn)J! 1Hs҆pd,Uѐ "F! g2V@ ?cֹqx!#7`"wyHj7ih>?XvM1 MrUu5K(^ɧ[w߷_߇EܙqJjD88m%(u+ ۢ0%9_T*Q(,C:c)U]! 
!7F?+};{*~9`(\k!viR\4Qxnhq贑KrFI@m'x)!yujBSμ'_m,̗x&'k J_ZScc"Ҧ>Z1` ,1އr[ql>Yl_<4^\ ,(v.W_ŗ͎3#\/+M *B, 3D^P! #a"^3,!:2^x+ҋF%{ˇc.U;h"fB[ъ0Qz{I"hϳM>l'a- 3Q-֛cs>sxw1,CcXeOnM웍_NKGTo Fv=##[nos :_7KAKUɻ堜a?yqR ~>-ށq/8t" j}WBR*B/.vlHn^ׅB5n`si =TI[mV}bC<7j"&d6}Ԕbz^`tE@PFV5et;='RU L>-# =˳bcCej[_]ͦZIQO.EC `POag^HmZc0`IN@(jju27ϯMm4OH8t ;w,Ĉf15g @QFgh=&ӡFM&%EϺy0S#9r *;(tw$Y1RkNX# -~ff}Ss>b4iJ30>%Hr\Iú+1 w~9d=`z'XU/ b)h&щU2UE,n{pcj&5Laf&Fn?qN>.??#LӶ))L0&a\Fo@ŜAkS%CTβta6@P(`X(VEj_eHhe|߂Ed^Uj sBC1h=G &^: %v0:k.}x`='-v(mEցƱt@ J "#V˹4-cl.9Ɠ~n=vўFp:)BWkOG:)^VrLL_+t( BS:KCQ|SdD8=alvLZ|nUXpyN $a 6X{  G.AkMunz's|>qrxK}xsc{¦~c!y.8_S9g6,VHF p=CyjS!DJZ2_gJ9v̀b1 K/K5(;8tq6:v1á:!,\'n1S"11;RI{1AD/]CJL/Imt=n )dPjz[֚?H\9m2 ] )kN 2DC[a+ٿwAО%"1>[p܁sDDFWt1\YD~ceJ'.B\֓Re'S\=y/}cT/8H}S1 'D)S oFSb,4-@͸ֻ]]cuR W3_?[@k3#'Q+C>w߻U.*^vQҫj&9TͪQH(-[[޲1pv2}e @n":  w[ hUzn@r}QQy|$Ii+kn[=JF9̱ɷu6&R5n|5FEFйAyYe>48% OGx/X'ȅ沋]߆/ΚjF#QIL֦tg& 9Vzb#B$"=@LSt͍,8qU@D1u@LN^7e  &dW']i0W= b(۴US?ɋCs^c3.(@ o c<'`ݧcb$ߕ-6Wu)g7lYYbZohDT(%Uh٦?h+Pm7C*0bj 7V[oqɪx.1' B@oh#`"9*cgU."ut(_iBRqS @FXܙgd$ 9X*I{ di>pb9ldsEDǫx8K;v zQvXG8ucײbߥQV%Ye܅"!9[v]7$,Z=u@'FwIgU ^hlօA8B_~ u*KR]}H?xÁ+f0vm9B}r]-e@Tk!>2.ؖ^،1, Yl&8$bR_`@pPt*]Z8I$e(&PKQt{T~cƴe~ѯ lX$]d46zNd]HQP0:@@@1LdTD/80B>ّò+@pzIJvBY3*v|x 9uot|c*K{ Ebu)Q>95K$õ2k7I TPBa¥ϽA &$PLZ֎ಞMAsv!HMq)EԳejt;gԅ=i4P;]{/< nSe5_\ f:+uN[O#Mᑭk;5AI!q9 $FRҋ3$&hC+SleT&@ً|b'F GKWqր<)I|b-x{BpzŚ/׫$a"I%]l+b6 B>Vq;\Á[{rP-Zlm-{HEk-V=Y;oBh[Ĺ鹾 Ju9`ݘ |Jk.ã2{0;#2@P:70 ugz(XRXFn1 ` H 0ѳ0GuL{ A$lA+3#_*a؁SL[|;_e9ةDOsSz ,Ftߏ*$! 
nA](TtK߿d4}hOGIE>gS.E3H4ҁqFDQtIz}N.7> "DnHed]ۏ׳sؼ#2p{zKdm5%tU}m<ܷDmQf 3RuՎgeq.QQ5@SG6D4Hǭ..Rwk3--ֻplڞ]J}SY(MG ̀+Oua蕶)E=cGN$_b5tO6X+w0kdΩݮ _56Q'aT:⛊ȝ]' ӤS(EM2܂Sp[Rȕ VD vQ\Ԣ²-rj7jQZZ/J:`B 9P xp;mR-GGmfIS!%W{w ;M̂͋ 'Qu_r|jODJէC-@LqwԬ6vYW]/5"?bpsڀ=T!sTS߱t`hmQ5>lyopКzF[eсF'#z\E-#[Oqt^`E#~ɘoa<$lO;p[6hFݐ+2MܑUw`E?JPA]r]NiZvJ8 Kb$Cי?P$Z{ZR8/Xp% T[%E m_RJsa ֒ ~jY%^Igšužr]aԲ(9;v-Q9o~3"<㫯~r.^󮫺ӡC7~t&꽝Frc6_h>{卵Gs>Kyuǣi>*{b2zVEK?*z8*kMR8!KE56zFq̃|a݀~yY9(5<Էw?s_C̻ 1:ߢ[lpMBԈV,A}:`ys5ǛF b* F$ZFQ'@cFף( `G*+04cy0u Z@ zԞ/Ŝf/f}9%F]$v7g)4,_;T/{Lbܧǿ[GWEk3WpOJzU*h7i~p8)uy킢#P(Z*K:*C*|}Gj“ZHR_uAKi܎!B>󷿻~eu.q7J%9{_PU*w(^8OzG9wFՏ|Jwp]Y-OWO(} cRxuhZIX]|dax;5TOy/՝fh|Hؾj~`,yE_})sܟux{Mb*ڌ+NeS$ H$r3q/o7WuI.y Vmmu(>}~^W?tsA›#/=Ћd! p.'ovfOJVq8{(+ʚC{dzĿ6BC2 1_gѾ=BˍVy񦌉O!W_vD_KI-_z$Tz3ϟͰ Χ@Qrwٸ9J:;7f-s t3cT7[jÃC0 ;M:oJ0||3gSR1 ~)%[pz+aW=YJDP&$M*yy~{0D|ES(j{>`9X$Icr tl"F1Lw-З_hGaqB;P֞0H)<ƌ.Qkӄi>\t. !b*zeQtZҘ$F!`.Y'D|6E秬p:"ߘvzyzZJ0 = “3{<a@HwlϰqnnE5.^Z,fyf 7siY\qݴs2o`cnTJɬbaTЁ9}M%Ik6$MW~v2!/HBbN l#mIxxЯz{8#v'\ƢcLF#8"#i[KrR ޘ8E?+hDFMsݽA<.z"p*HCU1u\ȵ&Z,/FUboW@קiSE\[R#g6q=F $(QB9T0gd089USJ˖ʖ](.tϩ{^vf9^Y>:f0`E LGnJ׽'6GǕET#rE|Kg\k |S-^6G_m5ǫ8j&k/|Gn2^Q*" (^4sT_|@0^C}]Ӄ [f/M׵xg+{&fէƌg?-9ͷbqHE!{B@޳֥Wݠ?>0Q`J)/5!:=@J8N&jiQ;b2P!O2ăQx@ cQ{T+&GZQM /EnGS\j5I4Tw;Q2YЌiUAu>(QXFI_.y糤r@Őz04RFwZh^N9Yu*KfBx6:" f^[!N³7UH@_ю& -*yTc~o 'T҄ DmC&t^5V"0 vѦGGօBpb(Nnt9+--N}Jw zB,^.X^/rA}-#ji0WQlzo^__+`z {yk;&l=@8_F 8A`Ǎ(Wz<ܘ`uK|&Y&yaXy.ecru<M1  O)͔*JW#,u-1HKl1Mh_]|.(1d/ZWz"anѪ3s-@TrqȚs@| %˧V+S1=d7G7j3>v\:s~5@xgN0m0ZY24zMCtvYs0J(lh1%l]km1t~cV_%QP$=8bcpEޝj=X•.\}pO~iwZaK{ѣR[ *GmlmyWƍ6H!\ƗɞwOAԽ;Q R<\(5sT \"@m&qvwPW&pБ#e%&-6J+A;%4q$ ڪ xj~`yjC(J0$IRyb7'g*9=u|Xv<ƚ4,e"U4T^KV "ǐkLy$ I]A1=wBs0)}hP3 8%D/°:dv~SJ>l3*4JFL &$$xG8N;iqs'" @Ic6@0%ICh'ku0_@Ӕ\G{1ٗ9^iYE8hxXSeY+믷ہ8AdQOs{ftZ)8/M8DW,ZIXz%nT VQJ̊PW"o {;Z^aH;#P ث̇ LV-@F5).ZdTQTskc{=;R* GfF*kSƁj5Hq1bv1"QA!j] !ˀ1%Ғ5Ҳ5A)ydEޛ%eW[Wn+r /n8I(OKױQcL4p.xVw/NVGGq7[T66G\;"BwԾ&C,te&;y&;n8t)"PU"Ԅ-iZ\BPcW2dZg\d@f0h^3c`R@dZfԁO"WǢ( ,R !h(G ĀF+A#ISPB sdn!+ݥqjh>$T}:>mrg#?-~SuXNӧjraV~#.?}Sa%je1b/'GLe)| f 
&O|5g0XF)yJ{Vh@9__?hNQG=nlUC#[ZN;XFƂ֭Rև|*ZS}ܺ=cGK UJ8zO̺1f1XZN;XF .-n}h7U:%?|n݀cn2(Qwn"bjFK[hg軋aY]KJ_53e%/1_}C!p!8ɯ!A~kiDvgqTb+T\f'󻝸![ʐfȇZg9s#Uw ,}_/>;_ί .W S{}wg#36_H,y{;&Kc4h6ӏ|m؃vK<irYDRN %-a^gv>yM_z24f{wA 'dϩQݧO:Mo8yhvD4߮ZܿLT5{o[1p:gO6ZVYx'nδlOȝM;Φ\S4`b葉.7o񶼝e)p$9$9q˜HEuvy>ůѢ5ŇR4Usrp< sv?_LL 0wqMo&wh,.^mV /y ǃp9T>l1}'|= ;z}jbDpg1LLbqktYu:[W_\9g[ 6jIz|H9V8" )^g{WT'M9qǬt3d{ㄨF=ղ`Ñ]?rh#§AJ=P\rʶk1@R@Rt Ny x:%=CPZf `NyTgP8eCϖST@yI)  F9`P)dȨ:k?!pFٞ)nB85ZP&}ϒӲ@ +mi;=^H8`L*.1%w쎻~Zg̺ Q !RJGIV[Β D !Zxc%ai0 ѤIV8-O)؆<݃uE(0];Zz:^XI[/aBpߢiAp5'Y@B:IMсd҂BTH!’PDqjfG[nxä#\p!4SƭApk%j7hχbF/& ǻ[J4' O%rI{ANjԅ_+{ɋ ~k&yik_ӫ Y Ԏ[DhCQ̽' #y i{~9y.KByлiR:Z!ƥ3ɲyؼ2e'y.em/hKd0 CgBɚ'azs5Cf)22pńz.q4@ht M8eI T8Ƚ'=AI11[g1QK1R1yຌ\,9TJ^ HNΞ# J $4Y*Ȩ@tXY\DZI#Zf%H@ \T/iKCMҒRV7K}0G TxCU;T^MY/- \ͳ^$, R"NI_ǰs}>>iQū"xqoصRBi5f`&gspLhAZ󊥖 RH 鼃ncBW1 CzpBy)eL+Nmr&`$Ge٨Z%_uEe^ygw2<+@kY?h -{W`dJ} }62͇ճ 4FSDDk"LD}WFV󝑢X= cm3_|"*\o1~:5ya弤Y54Qns F{\;|,Wޒ6fv2j`-Zue 9X5bh43O,1AK`ܙ]Z߳y%3EqɎ_!U?3}r{?Ddy{ ﯃[jGy晐xM@vqvs#^:nç >Qt9FYb"u`P;qFM%#=,`FJ M 6Ǎm0~"65צ hjw%at~+—H *!!=Lg >͢; ):1jY)6C(!7r @ߍ<h v)# oTBݡf7Rв5=R^ڇK(XEès^)C9rK%_$si fb$[1@,G$1 >M'%VeHy^LAR9{JY$F*swXhuX- nQ@/ !P }"ݓ^ S[*8.0@6&208"Eb̀2V*A#G qS {Xv:o3HO?a<4=VZ!3:* d IEr(ac4D f!{?HɘfZ{T#"xxfgH/لZ5KpgWguZ[6r.q~u1Y&;Swߣ:'Gkjo5&'Z4awwu3˫u_4H7ZJ(R9V'^?Xp<Fg=Slj1_sOC}ݔ [y-^]IOv,,\OIB"Z$S]1޴n4=ZRNmr')o m݊͵n}Hȟ\Ddr#n9VJ)u_NxC[bDs['"LdwDBR{YmU95 OB($>,=D2sAG\qڳc%9JN}3 |yG'b)b06^)_w.[OI>Pc3B(~T) r9$R{ q:Z"U~P/3CB"]Lln΢s"ӫY0Izn'W!NmyoεYd/fSb=Ub=w 3y55B> ȧUH,,&/:5fXr"Q7qu{Peͬ@DU0>.oPe >E{(h\Uɫw/ӳˋސxJylik~4et$[n1Ga K$%-/GZm/ exlaNc洼Cf3h`ލ3#-Xm_NΦkUǖ x.; [X(%-BSXnL-lbWMa{qj )}`S'}`L^D}8^D჉z%w%RK,spK4\ŗ,_\m=Sj(3*bl=Fble7Z16n$q ڷ)hug3+Єf 8 n]VIm*U,tpU7`뮭W6Ua>zqj!xVw w]JYz`N&ħ0OЯa>_>juq,cE ҆HkmHE"إq|^& Lsf_ڑ%Gg]o5%ٔLZMKɌeեPЌAոJ(tHIG;qrDpϕ `d͗mqjxބBP"VgQ%Nٗo +}oioHLwEd `.~.򦕚R1޴.Ul1oZhOjpY xLuܣ -VT7(hx.xiaߧ./8c/>-S~ƄƗddPhBBTۚbk1H%l"'V@9Wa;E(#/`1K~{,MDm Bz~^Uu_5\> V07(ްp;4'΃XegL"FPk(nEZ( .Y[&h&jl6,AlgZ}/T $Yp< S$H錂:l jX[Pnސ0Nc>-H"Q"38aK"NmHIrN-|u:1#֍h@=W,@Atf4 
F\pWFBzQǛ^jDbgH_@^;5*pi x"_C{`bw34n* 9]Maə@,f*8JeLjf{5U%52n5XLDUoY=s>{K"Nɢ=v%Zp ؕ`, j@MyTZ«S 4(ط8N#,c_ ~\ۙWμvU3vv6fS8RA!yp\x KwʑM-q2]`N:B=#cA l6-NEAہ.X(,f2b40(,}iЯ`Z77XܼU>H9m ⣣&0JAz3FS^; !W=T"XqسՍh8N\bAQ]w9R6)l0eRiOX<>]$:M ਦ;]/ ݖ$􂒰ۙ?`')i )6Er%p X3{.U[-f`N1ɯޔF[)&7 PT+SJ$䄐أ;&oPG("2&)lFsŎuCP>(H XB^S9&QZ@|QA|(Vn؈c$l7l Q EC%kErxFn1 XZ5~8Efa뭷.yK-^J -+ qoMܒPSB3T2aFجf[ЏP,)äރP:xl-O6@)QxYIU?Hh Uy}""4EX4:9ތ2ZHF SQrZ5B+iԛFjB/#5 a$vfd}+0LxEpK7GzQ$)xsHM^@qJ<c8pcL 'Td^X#&?7YNiPYj\D*JME1HC^Z_փZ==pwKŸ~%"pb) v3VL+NxŻ/~ônn/""C4+3?lFbEwh;HPF=3|1H1H f*N_k"~&0T+LUqYկSKL_ǵ09 +H~-8TS!I |)W*1Y]ŝ3Ir=sP} jN ! 6H8 Rga+IrvہZ%k|ofR9u6چ=k/ls 2rŢ;9OZ KN 2E 1[#a#E1b/ 5m؈3܍6JHjq9QHAbSzks_B‚5Q K;b#1xl]lrQt5Hs!|] HQ$C h"P \FoQ F% H21 5Zƥ v(rOg ]'U)p¹ GRj4 FPbFQ!x:)CyLQS%0#p`TQ"qNhsz?Pf60 %Kz3_ü&ތ׿wB.E).ZLn%1CyCԣ<~g?qϨ%rR?xqygz7 -wۻoB߾X:&.qȑ D [&ޕjHT#1=f_|>gnd>Mןko?g@O~+ Zݳ)BD6]*5+`RmϮW<M~۸ 7\/bLxX_gJ9u6/g׋9&rN_(-~9( {;|bNCv7(MCO-@T=`JȞ"Æ&,SɌa _ b ;>Iӆ#{jj$p6ςfw_3a)`aQOiXzpiT% %Q6?Uġ`L0<!:`.\`rI5f .f€>CF;'s=BZ aNǨ )Ԓ323nm?ҿoYRk[UXX’JEDG-f&=0C01LY^k*jtq%hPM`׎fbafWL]CbQqW)bNWbxE(J_uuXZ*~0~N& H 31Mzh <o0}b~g Xo= p|Lh ҪZ"WyLDe,?}\מA mD9l1n0 U.XF0n rcB?ho5 {x5 ua30)m1I [G0P/ L5Vy$]m밳V˜zR<,m9ZYړbŽ\t؇dXޕ&K?H|K[nGwZ/n7d%%:cL Qӕd0וd~-fwEI6)U ِRǿ={",vYƩnm5c4hۻ8E,w¾|, '9iU]e&.WIfSh9|ve>*z:_ܚud2~_l&D<4&Jyg Ҩӆ]vbw65y' ]b5] '&D-=;8?PJ&Oxz- >8)q g[֒уӲbON K1.C; U$M^“|-zxnr̝_v$9gM]q$4\`藖pH#T jQzGO8jdzL+8=8 !ݩcA%RduYQ߾ U{%&O!I7}yb>2GCG_-e^Ĉ(M2wwՃS43;$I;_n%->>7LIhf-`s~e5n?5loVR+e~w,~Un %-ljuo.=(|ir'S;sYOg@9՞2t)OcDז?ާ?Eh”sOw1kT\ɪdC;[Ajλo?tTum;ghÝ M(ƅDͿn R " N)2 VX(Pi1-2\eUeKyo4ڏFBzFhäD=s;DHM?';eXGi޼t“)(<0RX[[Yɭr'ҀFX-±{GI1QA# UVzl}"_UG5Vܥ"1BZmrc65 }7TļQ{UqVa9w^oGv<]QWXm\b™HF^s]8=z>r)ǃXMn[~bԹ{ZrRAm4 U_~}*QS"Vn Zуޣ38h=U5~(\Rk3Glwv@g 㹱J sur܃WVy 4EXVa״ *6;^v{GyvoV8֣MӐ6dſd%ic)dUv~FmCikL 5o&jL1G#k1J&<,3: y3 VdfG-(%t߿~}>E!4]hd楊e&hW+[l &YMw1vAx*ƫӑ8W!Pg#e+[q5y)J7gJK{2_);J_,ˈ϶bs(N$Y㺹[3u$ں33A1+W1C62\g's+mnJDԹV0 }^XJd`E$C5"Kq+0^ژg$EC]!晚֡@_8HV(ˣ_*s2]?RDuK~䶝~ }<˯Wuqu ]q1RA;iTX?>dr #6jt*$G/IYǓ K(lF5 xI,ޘR+Z,twBO|Qb'5ȿz uRCCZj]4E Y'c}̱ 
OB{c9]G3,eRQhetdMLDA(1hk*FA -s dIVh 93uږZUJʦ=7d Fd]6+ /J aFzaI~\Fn&WԴ^A2_sJJ b^1*u^ wk?S^5eQ+ǐ6!r]%E)v4"~2zhdT΢){Z$`bm+Y練sre?7\&љzdʞ{k}6Zȶ"TEmJ`bQ[\ áӘn!/ez7UH>yCꃨ sn*b)SD$&%S2Vd2W#. RVF] Bx91pu*dۇZtؒiJ2%9r,vU =cR(+ִ4\t˗*{A+}oXK ǝJcyH]cR`QC\Yf2f-v[I@*NnKm*^L ӴrvYWxT`.2U0I&y~ .HbnDYvJ Ӿ2Qw8(sjQ_-oyۇh ;ԕn_>׵I dsI; wAO|N:C01j$ܖιFa*j|u@ˣz>x3@(<}Sr cs'}RFdT#V׉KΆrs5YQm-uAiK=w_eѫ]ܪ:VkIn5!G5X̲ix&cjW\XK5}Voe$ 7pmj@H]NSRnsWUE=Ui:NOI"ᏣװJ$g_ e@=W)XPE6~VV؄RTI@}G&#fiadHZC s2( "yP* WN " Y3Cs%|<,wԾ`U`&<FnhyombBLx Bࢵ\!K."BPsiC\D&Aiwޑ5IT2:ЀDPn,q{ix[fJN4CU(-FNs>CBb$4ԧk2m/QY+w>$S.E! Q`<w 0/G#1:c3+Y*4\J=bVL)eA\y6-DRss e\vhl=^/ϟ~)ǥW_n\9\ G_hAۻevz_Xqʷf%u߯F-D(]A-?c6 vFޜ勛ۻ1yb;?C_>_[ixsvncŹ~5}^ɭ[SL( v2}svsxU[- Aܝ4DMMCc}i!+Pn{@],X\u2>I9}'QR%EM{,@yַv_JSvĠBZ?h4.3'p/g?%himg `.P(|Z8Fv_H*^g C p\~մ._nD'Yrܛ`OpqIMh EЍ7sH!JeSqWTHEAM{ȸ(!ަZxX6)tF'7/|냸*_ҙ;碿"+H lXj`I)rI!9E"V@'\yytP  Ft&)1Sm8`Rp@)K]ek$:i,eT(gD̼4|6'( YSZQL>ZS~*R@C0h]l.߽#@^dt YL)SJV$Fl<ŝ{ ̫GoA~uإ^*$D(-Қ@@}fqj4ʶ {o9=/O1]WЩgaPMQQ~,R^Pr Ut{cnLl}Ojyjlcx}y ;p5ګg 2]19i}0#˻ů:ֿ[>f qӥ,{9lx;,ޥ|uq7j{+sp6SP.hU`xDNRLlIJEFכewr}uUV⣮+;Zy5jKB1uzl{s1fsOj[mڊ[t]'ۏ_=|6CSc b3SjK??O'pluj[4[>ZG桶yApru<$22uQ{Nɒ|YdPE8S;z|'JQwR||~UoGQu9^ҥMR_)dM6H;b i' 7lj?wIBENep__¶N7 MVϺ7 cqFbgu7pog^RFP4ʢ7 h 7 U oRNuFƵ:ЍBj Umꃣzȍkp.$ zIXVMm ɕL_J.rKikH͉e%%WNΧ9P:V9bQ/tR;8(B?^f2,!1E^ fL(8š<.Xn/v`'K’dOQLU,ʹ@bMUS{~~#.}3UUB7L;olIFKD 3i3;)i65 x д+vbڋ=ʝ2[-E;/b;r)Z uD B9?ie !(7Y&sUQKm``<(nչ 2$Je#Dp# EA@:9meP>V)$ޠUך[[58GrƐ*T;uӇϵ011dF7D)L;.Gq Csʀ64 Ĝ<!g4#H`T1#gUI tg=E{r.3>( Z1׆Q&R\{0q @dFI5PhߤO_ڝrF1U9o7~I\.Ɵ)I'~I~#/ٹIÏWRy\G̲7s0S>}-f^NI-gdS{6 &BsAs]'{iv;gE,G3^dʃv}C/(d<ٯt ] I09 L1.0sK?wje`s]/,zd^?=t5bKI+M:{N.ޭuҴK-6wF;{>G E۟;]:)r!`rKSJ48-JsxHcQNT(Cqd<)6>2o ,/&߱uvԩoة}'0wu~1Aݺ~ /ƈ_>Zcy~1>b_/Ɲv#q‌E?  
4hk }ʁaQ5J0@ZShbh?9b6ry0uu~1hbR81sl~1xSj>$zy\_%4 c5fB̍p0#f+" \A-qx([dP ǼD, W}$ [mS$[)v'bgbKÆ8d 8aE nugG?d\\sP9Ɋi̩$>XGWuj9Tu:aQ@PY']b-fsFf&5KJp3cű~xX r/nH^&ZKVOvѳ=+J.B=mI"{7J5'C/=vXNӠC&֘ZZlJZY6bhV]Z&mЙzІB6D`Mch0]\7D.m _ly_Ot5Fӡ) 3$CJDMs),D{oڄ =Gw=zL?@=꽑m^#՛5F;oۥELry *_tta$&lF auo"Ca_,ӺSwA^#Q;`/!uOC!u'&ͰM<j"QŞRU5*:ZQ F6LV} qUMrma,ejӋQ+U`.vC.rڭoeפҋnupȑhO)~/R%bPuv:dive[ r,SUw\Z㇠R1(:GuW4Tִ[zCڭ9rM)>$AbPuRv0045޲vCExJmk7.beAI}>ڭøқnupȑhOnظ˘i"R+f]~Gڋ&,3CExn0|T ʰN3ni%Q{!EKj:8Y40;-1gR1(:Gu/ 4U5:vCESx v9ڍnagݺ+Lִ[zsڭ9rM)MVC^^$(#Dr$!TӤ¨`C/[$pջN<6Cn[Y$BM3$3cH/D{GC(bLbܿ3+cS81fbCVIP/ P!<ĘAF-Z;bGcŘ%!<Ę$_YCy1) Řh|c>/,Cy1* ŘEx11:%AѪ+.c̊cWx]Ƙt11Z%AS޿\1!\$hzcL!<Ęk^f1f-5bCVIP.ƌbCFI"w1fCy1( Drrj/̲| ~~7K &.?.P|NCav̯Zsb٥q%/)bs.dm66AbRM^bd!'TZ"8x܉ FP!gvv;0?_eQV9)S*a=%)g#U+X&.ᕹ[iI Њg׷b%"*d 37hzrliZ2V~7ltO'ѧeNtǢAKNP[!԰iC8t4Edx{~={Ip{ Lp5z/K5`γ0fB̍,Y~ƭ)9;'K3/:/X̺[&.Fmt݌8+`W 8nC󥏟O%-Sh:!dn:p-`$p5L a|2af߮߸^9 p[7{u8q~ Zî"sqD=sk$E^Sa@`qȸE| v @ [|Z?RmsBHP,;0&@Ș` E 7,_T/m>oP~lԝ_lX(tR{{b%'N(ph|`)#Hn\??/G ٍ,9IhpJ&1<쵱W~ú6(G|۹_Ə=:=Œ6_sɻ=&77~Φ=ؚ;_& ZFRl|M7 F17lz8|>G%3]W\ƙWknu[dyll_M([\։')pVD»Ӏ4cIl>f{X=g,%?WnKVyF.T~ !ϩW3  * K|1} ̑@Di/A;$!{C`{Rp =FOt<Ӛ'5,ظZdD<\}{s-~\MtNV6ȣ߄6WgvM*somjh,.> ޳6q#WX; :ՕKeJ|kX\SLRv[ )i8$E AfIdjiݍ_K ~ynn_^0dBxR>ws!Ro~yu1Gba_ߍǣ4) yQ)gGTߙdngۃFC1Xbychܲ"+ Gxw]q8CMNIxs;yrYm4DbWL!'<?fFfFd,#nbYĴY_fa2Ogm]?bx:CIyzGrR )ŵ)9Ʌ,)#3yu2F+eYF[˒2ѥi]_w eWАوR"WzBk Dk6=ӲZA%VX+RA;ܕ{yllYIHf.> ? FobԷrJ?ƐO~ ګ%ݫovEg#o/<ID: rN뗫h$t5'RZk~(_u<4tVL}ȮLS*TFXs򌧔q?.3sg*FkU=78-;W@,e_p*0BAP) t:Ћ!eJ/#B`5w"ϤuXXS  T*ar D"%uʽXF!䒞bhSkJruC D }оh&Y{E5Jv&JTxXI9xbqM91^ȑ9"ӹ3ZVPG5H_  j؁9b00x158б0) Maƛ&x e5C 7`! nq{Mj <5``kQw,(>Ԝ( 'w;l. `=09@DD.Yu9zlA|\J2j51ЯNdɌ6b\B. 
pWVuD*Yޓ\z2~T&~+ȕ [ LnA `\ORQBnsQKt]l+?-lo]l;YWTK|=ɤ .03[rre;vI쭢Ad%s˦d~}k\8yCc*rXa(#C!(5XH85R:c9^ a.WH/a]ͫuilG KQ`]UKn?>jzYָ?z|QKYSFiII<LIcƐYS1+l57"7φBqBIyd*!PRM:ԺK֪T5ʇrdt^ z}cs RfƈZeicUuK+%rކ'kjJ<~18| U)$RԊs4J Jo!aL3I$Eq5jmLZFBcʥ rD'EY \ܠ#Z hM®GaxwcB:Tk2Yfqyn@y\ \*cM ؘ!#K9`f.NҔOgoe>Au t%G6d߀$aX'}l+A<^vٲϬF9B"StҨ 9M$yId;8,`1/냵UJPE#7SZsކ2y2Y6VġJR=%:,*Aڃ}ک%Ծ^Ob] -cu(AC8k!8pE5Ўl'z(EiFj 7URBJѻ.XRA 9c(K)p9KsE!-Z(QjuxQۗ1m PSB72DbB 2 a*wy Ye`AQN:i7XGh:ssnSjs*# (;'`BsS-*ڎj-q4y P.@ٌᐋJ<0%Jz_ث)|锡ֹqFQH̟`z(a7lq^ed@"+6LQJLx 7ޣ N7@̳TZ2;Hpx|\-ݕ1د2r &ER\W2S¦ R%oQ 块x B<$կWKĒwM,Y%?BJ:s:|wn35(*a+Ss;Z#J}b*9]wY.ڿ2V, vB ۮݬi`Jww6Vz5::*p= 5βB HkH9RsJcǙɐ hkSp9 'kC1X6 nWx%s9֟[]%1Z AQB;c2o׼Tք5Q(4wn .Z*6+rBqzWx55ATd ȡdK-bRf`\@^opz8XѮ&8jjqAPbʨZP7lr8S )dTwMtJ=OU\UFNG}?e*OֆmymkӻqO}qE5wK&cEXvki:rmڭyTvkCqmS|ĩ$R[SİNwnnj37|ThvkCq):$-O줪rԣ\U:A?)9@Jg5S-a\3+K,k-_kX ,zFO̻% HF{5vi04w9AP/^^\~Jg㑹\2?/W﮲<jW_OJ4F5 2 BCS ѻ9zpNoHT B'j j w> (`P]l(<~p B/RR/R(I7`d&r LL .XH%hb L\ nkz5T^~K<-JYYxp6-߶жJiiw[ãd|6"IjIr4H$yfkDҀ#MP؃c.A8]ˢ4؅'-߻DK]Ӧ "[{Mwl|vz:\ Lg'EY|>%;]gh{iwP;눽Շc}:ÎɜBw$%R 2%)GM5YCeYPH岁fܻpI$8W!f&Φ,04UID:';QKJ'1о#NCG|D%]"1s61i%1/]%Ӛx["MK"{铙y2ڢez;b|=f2^i.R漒piDpgxːy-UKĊy[ZyِGx|p_tf5/8Wgz8~_>Fx3~z~O٢ϖBB/Otۢ!A{)9$ b1&RXlnlFKxQʠ7ȸf [y䢎pu'LR6oIU Mg9EM<+1sF`*17;JG\>}ҶΉCihW68 \$4E=J}vkƸ,Xއs, Llgtց2[@tǷ2#胩#$v}gO {7t8qF+Ͼx%?zAL z-^CHriF &!D99eXu?}6R6=P@a5|xLDN豆rlq&zQWL.UGXn*IupN{?|kKL{PPF}3:'Ϝ" \|ԋyb(X&^p0ಫɳ]oSxRTl[mTz Shg aabwM(Oo-a{n^27!K- ?T0ſSI7X5fbhSޕ5q$鿂ˬwFe%XJa3/@)& ,ɚ jDh!;dhTW}_VefUVfw#TNb:@0Lħ|w JP݃JcMi?X;Ï@oz7ǖY! A6*愋P'3.=qAq > !" gc UWMa,?;>VMTƼ@IIT\Nʂi?K8]A?.K?i~.]Uh"L5z굋ig`~f2iiq03'ژ?hޗ_?ebg/,JƓBˢ6l/?\\.N=P[* }:opa v}5 ?4O $~6o)$$$Ԯi<l(T+ ءFY/=X0LF(T5^5x _~XkT,'Mu7ǻ1~aDc#hËFI 7/=XS Z E g#j`r󾞺h-s.φ3 ]#@_j٨*!l)JQ0Hꖕgn`u:-0_͆d[@D3o(svHkVp$RQhD܇d 淫@S}-ʰZ`52!`#b{1& g`q`-AЙ@Bs٦ ~baݭ J In~*;J + @!%(T fSϻќ!MaTHl/%jM*ʴ+hw ) o}U}쁱]S=[=0ee-0ED֒=L1ؿD]="b)Ֆ%;UmFyQ,0W2 ֠ Mi6"1/ AGdHlou$DTO0Bx׌&=,mˁp4oyFMNL0ZGxaBR5e D5ɟ%||ݜ`j腌jxS3dJ !ӯi`bC#`]Hb8vCQ,qGx:N S}JzʩZYj9=e}* dݡa! 
p)5, XRP6M`vWJ1(ߜ-#9WmF0z*~n~nVvZ7X~h oz4UioI/^x7?<L2ݯ'$SF,*DsE%sfnqQcJ֯QZe7K0·WBA7~P؇O⫣Ou ʑqj5ȍ'+)m0 |ٟ%T)ul:t4t /tt}ukjҽLݨuCpIiD!&iʖs pyI<%BFw]%{ 5e?ŵWP/4%) "@MwdrlydvL,鞂kg7ttx s`H _5d)8(Y*yDDo>e1귨J5g sr%RYOǢfDI.KyN'-|YwOdq\<-#3xјbKC3g+;#,ч$ԩDQ-'ѯw5 r.ZYkI 0=r#oT5L{?3~={og0(E5\eux{w{Wgwws7'3#o'p a'fqKGO>HAt\ٲ{ aJP6aM'y^|9&Rg1d~lƗ|D希 }S\{lSs':f-5桸bc-]];x됴;3_ ycXN* 7\U⢚rmVybМh]88\(\&(YMHV 9|5>QkU3#}yc|ij1/VzԞjMГ̢{t$BDs tKY0#FTBm'M^z)`p?{}Ӥ =ˈcHYGV&ꈦ$ T,ܓDaq/׽2ֻEb.=ʇ0P9Or$ 2%lTbZ㛿{b+yLJ!}uLQg8ݖ.WZzWdy wo <*e%483Q1mNbU # \!ŵThbg:x3t=Z(48ciLY0*%N,l2$Q!iS%oq #Zba)|aFO#l1T88331^0R0]ϻZ lP' w=Å9]i3Jv֖B{ "R }CFJ.r#;rg)ݍcE`4g6RÓUNCUSIͳzC̓*Z>F1%'jʝ Ijq< &=}M&;( 'ĿfIGQ&J?QwPPs./}F+zoo^P=?W9g Gq9LT'̃ߠ8| `NNã?WLa+sJRc\B4G,*.7Uqiġm:mm-_YEўw|%t)OGn!MAy.m5PDX<ⲪeqթJ?\HVzU@-쿬S9W-̓LQv჋f !DI%;eAweesIpXm+ OX3^*ud( *LI1KD+\ %D;g)[?eɐhCN^([E>pX u9mτ׊bbA[xfz@z7zAvRx/)V o)#D0f^5atE.j x]2o,qڣ#A" iPTX5F`c 1X D-Qh)d-4f~*D>o$ }kӕͿ? *qyeS7'aDǏߐ7wE}8UWP~>@G7]KhQCO^uaMO B:.ZSBVU KlSw;wf2]N,$8I"לv;?4]%fs)R1,Ofn淹vW2)3qCR}52w|Lǥd.|kX\.E?o7m:Pv-1"rrxqJhV1x)<7&J08q/0-.~(&C73z%mF}rCb<NW(nO.2p {ƍ%˜Kp6,=d6'/$i+eG [MJIIvSHfƶ]]U]B<: xYlqa%'a'MEcg0gЀPS":.Xz AS,|8]w6+iM- IcuDl״LT?1ӎI)鈔j1va 4s 73j)x%j뒄 a^hj"R#XJHJ\2r "54=~IQ#Pdgp'b~mN΂ux(D΂\σsxl|3ק w^g;l+nY{Ϗ?$zqe)H]>Z9Y0uO'_cxv!l.Rbeo bL_WkaasoTwIϕ#|>9^])Jo*>>Wh:PlRouI~{+,H}E3Fm7sv6r/<ܟ1 M.Y|ջW^KV̟z|0= ϊ-$Ju.0 f*adHF7X]2}vego"uc/G^'>\X`@^ܱp9;/M)l|33oQԢoTף !XeMf ]rƏt{FX TsQwntп\k`G祋?^㜡 lڔ;h>e@!|]܅]E1;\ oCkP"0Kct W:c".S,-gR(JIm eJt|r>CpRGFgr"C2 -92Dnlo^φ\ʤkFe oCKZzc{PJ` ,E^<dwK(1?G0/}ꁼ(4/Qޢbulq"i:ĢR8+t%/prዢ.Z >@YB/cHł؈*Dd"22BL#a0CIm6ݨt x6rıDZDs]>Gl4EX\3Z ɲCrz镍&y9קx :ACe-`-?!DFbڰu x΢~*a>d \Q@4e]h)- iy W@~]`%-jmX.Vb~7BJ. 
U<_]?/SJ |jelq`e)*MT&VhYcU<9/>mڵd\XSo^4rPM"o~a0 oz:b>ݡgmƪ.p\\I&l3޿j0PN`K'҉1۳~uɵ(Uy _>لZң*);)ӽSq;OQ+R0C eh[J3c0)+;Cօm?F`W|͐XӋ L4w3Nkp:Iv^`V' ǡÂd'p< yޮHh^C))'S4NЄ1"^o!A\`ȑJNp)0ecDiÉ S6XSFv+pZnNV@B򼖤h<ԒԙVx~$F3`}tgR GpV{Ũ$sdu2 D F,vrj%%RPʽfѬh+R]Mdh%G5ID`$ ZWTC7ΫP ^C=<%'ye3X-,<*{!k/c< \[,#>7ƁkmS.!K.ۑ}9@gnqskٷg/s4$!0N.[`9z͘MLL,U Cn4vmrYZ)sqmSެٔn1H8a{ԑn3UMM5)7΢MxJttcnMq:M#E{zn͇{J68䍳h'Or:JwsgUO?, s޽L@HC? oӿҊэ|EI0ݾȩvvwwjVw?93NrS/><]]e.yx6^sqib(I[Z388Ŋkܶ)BQ47:fª4J%R%j3SDOn$5Ø,n-],lKH1a8W"FS@"),s%nƑTaUYMV\r*qtJS@r\!u\ [vۚ&9r"Ĭ XT;y%~6I(kp*4pJ S7cIH"CU"4IN,Hpe~ک t!.YI/%c=? n-8JUr Ƽ:]gzmƥa:kPT ,uuֽ^?rUנqZ/DQ^ܰwI2'OpR:UGQ#…`2>Woы Ln/9a;^\dovtb. ހ.wѶt|29 lz:i1T< 5Fgp~qwx^Dbqwvz!ϔ4ofMEr?9%Su;$6 [}$S5z1rL1;*#I'+l)ߊHӪMX\.Sa%)#D.,67l]y v~?YIٛco0H,2(4q,*TF!IU"#&O}V_X(#޵ #'"bb6&x6k!=UaU`*?(t`=&?_ 0Qg; -bi|R 9ư7f-, '|>d9a I(o$5CYynfT(DBPoA|!v``1ڈ N=ޅ$ I֏0dDizEPζPfMP]7D>GX(^? ܻ\%JRNHBp؂ujHi8u[vR@"۳^.t׋P;x-=ݣzd[ G)M%; |i T۝{^ [#ӁbUP& 1\xTwDNg%IW V"Фԇ*cKtiʾNb"78TS*FSd.źjwd/=W7¡/o\bmJ4V%<2D֣)$*IRBpEqJ6&j$%-~hTHRL)cdqc7FR$KUl49kn<!zQzj[Bm`Bm6O)1MѮ/6K\KhE\c$eX )R\XI8n-hTlSmnAk։;Ԡ@=:FoN+-*4qݪK@nPE#42zrDSFHzpQ̩B^;ܑcwUǩ;{)(3z`ܽF;&PUOgZo/Ud!(2 fWa<9=YA^] ~nw5Bk9Ni&&`kQRI' E*Tx5kr0&yݞ9J-UZR Yyb44\qHNn#cg-#}K֤?g3ms{i{X%o75^{Z=AĽ܏uTjr}=){S讆ZRp |ٗuQhֻPŗo Ë;xHi(<4.1Dna{:Z T.02Lf+Xuv}ZUʫ)8_W81&drHD7f)SJc-1W˖pC@wL- $-0D"Yp 3h;XY%U(|mQ=tWCkbpG([eD?ZljrJze\زKQDvumb6;S*TCvhAh\ U_  A;!(bb"_- J(T㛮LbH[(Jy,QKjC(<ZVx&jy!VK҅P\I612 踵̐uh|'49L9e1qAӍ΂OYW!=\l7 .ݞ6}/iތ)'c#>Uc>G"ZXX%85H4(&Ju$D7EZXm3 jgO]2} a g)62Fƨٻ6cW%[?kEpeP T4I3^$E g[WWYkNFb`6q'99g쉠/ALAԹH ktS2# eZYvY贶؜oAc^<}^@v4ȥwօnM9v;!V\\fu ƪdAb =r&"p{1v2! 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 20 10:54:18 crc kubenswrapper[4695]: body: Mar 20 10:54:18 crc kubenswrapper[4695]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:53:55.127891883 +0000 UTC m=+12.908497486,LastTimestamp:2026-03-20 10:53:55.127891883 +0000 UTC m=+12.908497486,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:54:18 crc kubenswrapper[4695]: > Mar 20 10:54:18 crc kubenswrapper[4695]: E0320 10:54:18.054472 4695 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e874557ea6134 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:53:55.128025396 +0000 UTC m=+12.908630989,LastTimestamp:2026-03-20 10:53:55.128025396 +0000 UTC
m=+12.908630989,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:54:18 crc kubenswrapper[4695]: E0320 10:54:18.059291 4695 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 10:54:18 crc kubenswrapper[4695]: &Event{ObjectMeta:{kube-apiserver-crc.189e8745e9a61722 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Mar 20 10:54:18 crc kubenswrapper[4695]: body: Mar 20 10:54:18 crc kubenswrapper[4695]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:53:57.573023522 +0000 UTC m=+15.353629085,LastTimestamp:2026-03-20 10:53:57.573023522 +0000 UTC m=+15.353629085,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:54:18 crc kubenswrapper[4695]: > Mar 20 10:54:18 crc kubenswrapper[4695]: E0320 10:54:18.063568 4695 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8745e9a6cf51 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:53:57.573070673 +0000 UTC m=+15.353676236,LastTimestamp:2026-03-20 10:53:57.573070673 +0000 UTC m=+15.353676236,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:54:18 crc kubenswrapper[4695]: E0320 10:54:18.067491 4695 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 10:54:18 crc kubenswrapper[4695]: &Event{ObjectMeta:{kube-apiserver-crc.189e8745ec1b0314 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 10:54:18 crc kubenswrapper[4695]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 10:54:18 crc kubenswrapper[4695]: Mar 20 10:54:18 crc kubenswrapper[4695]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:53:57.614240532 +0000 UTC m=+15.394846165,LastTimestamp:2026-03-20 10:53:57.614240532 +0000 UTC 
m=+15.394846165,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:54:18 crc kubenswrapper[4695]: > Mar 20 10:54:18 crc kubenswrapper[4695]: E0320 10:54:18.071322 4695 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8745ec1f3083 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:53:57.614514307 +0000 UTC m=+15.395119900,LastTimestamp:2026-03-20 10:53:57.614514307 +0000 UTC m=+15.395119900,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:54:18 crc kubenswrapper[4695]: E0320 10:54:18.075881 4695 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8745ec1b0314\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 10:54:18 crc kubenswrapper[4695]: &Event{ObjectMeta:{kube-apiserver-crc.189e8745ec1b0314 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe 
error: HTTP probe failed with statuscode: 403 Mar 20 10:54:18 crc kubenswrapper[4695]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 10:54:18 crc kubenswrapper[4695]: Mar 20 10:54:18 crc kubenswrapper[4695]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:53:57.614240532 +0000 UTC m=+15.394846165,LastTimestamp:2026-03-20 10:53:57.621614942 +0000 UTC m=+15.402220525,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:54:18 crc kubenswrapper[4695]: > Mar 20 10:54:18 crc kubenswrapper[4695]: E0320 10:54:18.079607 4695 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8745ec1f3083\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8745ec1f3083 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:53:57.614514307 +0000 UTC m=+15.395119900,LastTimestamp:2026-03-20 10:53:57.621668273 +0000 UTC m=+15.402273846,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:54:18 crc kubenswrapper[4695]: E0320 10:54:18.083060 4695 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-apiserver-crc.189e87432eb15afd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e87432eb15afd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:53:45.846487805 +0000 UTC m=+3.627093368,LastTimestamp:2026-03-20 10:53:57.988053922 +0000 UTC m=+15.768659525,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:54:18 crc kubenswrapper[4695]: E0320 10:54:18.087397 4695 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 10:54:18 crc kubenswrapper[4695]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8747ac07d3f0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers) Mar 20 10:54:18 crc kubenswrapper[4695]: body: Mar 20 10:54:18 crc kubenswrapper[4695]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:05.129176048 +0000 UTC m=+22.909781611,LastTimestamp:2026-03-20 10:54:05.129176048 +0000 UTC m=+22.909781611,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:54:18 crc kubenswrapper[4695]: > Mar 20 10:54:18 crc kubenswrapper[4695]: E0320 10:54:18.092021 4695 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8747ac0978b0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:05.12928376 +0000 UTC m=+22.909889333,LastTimestamp:2026-03-20 10:54:05.12928376 +0000 UTC m=+22.909889333,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:54:18 crc kubenswrapper[4695]: E0320 10:54:18.100003 4695 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8747ac07d3f0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event=< Mar 20 10:54:18 crc kubenswrapper[4695]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8747ac07d3f0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 10:54:18 crc kubenswrapper[4695]: body: Mar 20 10:54:18 crc kubenswrapper[4695]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:05.129176048 +0000 UTC m=+22.909781611,LastTimestamp:2026-03-20 10:54:15.129063559 +0000 UTC m=+32.909669142,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:54:18 crc kubenswrapper[4695]: > Mar 20 10:54:18 crc kubenswrapper[4695]: E0320 10:54:18.104922 4695 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8747ac0978b0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8747ac0978b0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while 
waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:05.12928376 +0000 UTC m=+22.909889333,LastTimestamp:2026-03-20 10:54:15.129157241 +0000 UTC m=+32.909762804,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:54:18 crc kubenswrapper[4695]: E0320 10:54:18.109991 4695 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e874a0033d699 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:15.131281049 +0000 UTC m=+32.911886622,LastTimestamp:2026-03-20 10:54:15.131281049 +0000 UTC m=+32.911886622,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:54:18 crc kubenswrapper[4695]: E0320 10:54:18.117726 4695 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8742d304994b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8742d304994b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:53:44.308439371 +0000 UTC m=+2.089044964,LastTimestamp:2026-03-20 10:54:15.257867302 +0000 UTC m=+33.038472865,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:54:18 crc kubenswrapper[4695]: E0320 10:54:18.123825 4695 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8742eae38b57\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8742eae38b57 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:53:44.708926295 +0000 UTC m=+2.489531848,LastTimestamp:2026-03-20 10:54:15.433587262 +0000 UTC m=+33.214192825,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:54:18 crc kubenswrapper[4695]: E0320 10:54:18.130106 4695 event.go:359] "Server 
rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8742eba0fcf4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8742eba0fcf4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:53:44.721341684 +0000 UTC m=+2.501947247,LastTimestamp:2026-03-20 10:54:15.446448901 +0000 UTC m=+33.227054464,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:54:18 crc kubenswrapper[4695]: I0320 10:54:18.819255 4695 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:19 crc kubenswrapper[4695]: W0320 10:54:19.317720 4695 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 10:54:19 crc kubenswrapper[4695]: E0320 10:54:19.317847 4695 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group 
\"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 10:54:19 crc kubenswrapper[4695]: I0320 10:54:19.819749 4695 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:20 crc kubenswrapper[4695]: W0320 10:54:20.591686 4695 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 20 10:54:20 crc kubenswrapper[4695]: E0320 10:54:20.591737 4695 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 10:54:20 crc kubenswrapper[4695]: I0320 10:54:20.818744 4695 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:21 crc kubenswrapper[4695]: I0320 10:54:21.819563 4695 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:21 crc kubenswrapper[4695]: I0320 10:54:21.886062 4695 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:21 crc kubenswrapper[4695]: I0320 10:54:21.887275 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:21 crc kubenswrapper[4695]: I0320 10:54:21.887314 4695 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:21 crc kubenswrapper[4695]: I0320 10:54:21.887324 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:21 crc kubenswrapper[4695]: I0320 10:54:21.887803 4695 scope.go:117] "RemoveContainer" containerID="f9c109f2e8f6c7345a96092d99d036522e153a3ddc0b7dba1d47765605dcf44e" Mar 20 10:54:22 crc kubenswrapper[4695]: I0320 10:54:22.128003 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:54:22 crc kubenswrapper[4695]: I0320 10:54:22.128231 4695 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:22 crc kubenswrapper[4695]: I0320 10:54:22.131619 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:22 crc kubenswrapper[4695]: I0320 10:54:22.131665 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:22 crc kubenswrapper[4695]: I0320 10:54:22.131675 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:22 crc kubenswrapper[4695]: I0320 10:54:22.819628 4695 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:22 crc kubenswrapper[4695]: I0320 10:54:22.970468 4695 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 10:54:22 crc kubenswrapper[4695]: I0320 10:54:22.996373 4695 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from 
k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 10:54:23 crc kubenswrapper[4695]: E0320 10:54:23.024358 4695 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:54:23 crc kubenswrapper[4695]: I0320 10:54:23.062848 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 10:54:23 crc kubenswrapper[4695]: I0320 10:54:23.065341 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc"} Mar 20 10:54:23 crc kubenswrapper[4695]: I0320 10:54:23.065605 4695 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:23 crc kubenswrapper[4695]: I0320 10:54:23.067012 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:23 crc kubenswrapper[4695]: I0320 10:54:23.067145 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:23 crc kubenswrapper[4695]: I0320 10:54:23.067232 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:23 crc kubenswrapper[4695]: I0320 10:54:23.820668 4695 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:24 crc kubenswrapper[4695]: I0320 10:54:24.070206 4695 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 10:54:24 crc kubenswrapper[4695]: I0320 10:54:24.070858 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 10:54:24 crc kubenswrapper[4695]: I0320 10:54:24.073245 4695 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc" exitCode=255 Mar 20 10:54:24 crc kubenswrapper[4695]: I0320 10:54:24.073285 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc"} Mar 20 10:54:24 crc kubenswrapper[4695]: I0320 10:54:24.073328 4695 scope.go:117] "RemoveContainer" containerID="f9c109f2e8f6c7345a96092d99d036522e153a3ddc0b7dba1d47765605dcf44e" Mar 20 10:54:24 crc kubenswrapper[4695]: I0320 10:54:24.073504 4695 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:24 crc kubenswrapper[4695]: I0320 10:54:24.074619 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:24 crc kubenswrapper[4695]: I0320 10:54:24.074661 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:24 crc kubenswrapper[4695]: I0320 10:54:24.074677 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:24 crc kubenswrapper[4695]: I0320 10:54:24.075346 4695 scope.go:117] "RemoveContainer" containerID="cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc" Mar 20 10:54:24 
crc kubenswrapper[4695]: E0320 10:54:24.075626 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:54:24 crc kubenswrapper[4695]: I0320 10:54:24.820502 4695 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:25 crc kubenswrapper[4695]: E0320 10:54:25.031510 4695 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 10:54:25 crc kubenswrapper[4695]: I0320 10:54:25.034789 4695 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:25 crc kubenswrapper[4695]: I0320 10:54:25.036097 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:25 crc kubenswrapper[4695]: I0320 10:54:25.036137 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:25 crc kubenswrapper[4695]: I0320 10:54:25.036154 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:25 crc kubenswrapper[4695]: I0320 10:54:25.036187 4695 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:54:25 crc kubenswrapper[4695]: E0320 10:54:25.039206 4695 kubelet_node_status.go:99] 
"Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 10:54:25 crc kubenswrapper[4695]: I0320 10:54:25.077268 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 10:54:25 crc kubenswrapper[4695]: I0320 10:54:25.128625 4695 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 10:54:25 crc kubenswrapper[4695]: I0320 10:54:25.128721 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 10:54:25 crc kubenswrapper[4695]: E0320 10:54:25.130518 4695 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8747ac07d3f0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 10:54:25 crc kubenswrapper[4695]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8747ac07d3f0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 10:54:25 crc kubenswrapper[4695]: body: Mar 20 10:54:25 crc kubenswrapper[4695]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:05.129176048 +0000 UTC m=+22.909781611,LastTimestamp:2026-03-20 10:54:25.128691756 +0000 UTC m=+42.909297329,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:54:25 crc kubenswrapper[4695]: > Mar 20 10:54:25 crc kubenswrapper[4695]: E0320 10:54:25.136128 4695 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8747ac0978b0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8747ac0978b0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:54:05.12928376 +0000 UTC m=+22.909889333,LastTimestamp:2026-03-20 10:54:25.128792488 
+0000 UTC m=+42.909398071,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:54:25 crc kubenswrapper[4695]: I0320 10:54:25.521756 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:54:25 crc kubenswrapper[4695]: I0320 10:54:25.522027 4695 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:25 crc kubenswrapper[4695]: I0320 10:54:25.523430 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:25 crc kubenswrapper[4695]: I0320 10:54:25.523470 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:25 crc kubenswrapper[4695]: I0320 10:54:25.523485 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:25 crc kubenswrapper[4695]: I0320 10:54:25.524119 4695 scope.go:117] "RemoveContainer" containerID="cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc" Mar 20 10:54:25 crc kubenswrapper[4695]: E0320 10:54:25.524333 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:54:25 crc kubenswrapper[4695]: I0320 10:54:25.822465 4695 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:25 
crc kubenswrapper[4695]: W0320 10:54:25.904576 4695 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:25 crc kubenswrapper[4695]: E0320 10:54:25.904628 4695 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 10:54:26 crc kubenswrapper[4695]: I0320 10:54:26.819150 4695 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:27 crc kubenswrapper[4695]: I0320 10:54:27.571928 4695 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:54:27 crc kubenswrapper[4695]: I0320 10:54:27.572127 4695 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:27 crc kubenswrapper[4695]: I0320 10:54:27.573359 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:27 crc kubenswrapper[4695]: I0320 10:54:27.573410 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:27 crc kubenswrapper[4695]: I0320 10:54:27.573423 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:27 crc kubenswrapper[4695]: I0320 10:54:27.574130 4695 scope.go:117] "RemoveContainer" 
containerID="cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc" Mar 20 10:54:27 crc kubenswrapper[4695]: E0320 10:54:27.574556 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:54:27 crc kubenswrapper[4695]: I0320 10:54:27.818410 4695 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:28 crc kubenswrapper[4695]: I0320 10:54:28.820121 4695 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:29 crc kubenswrapper[4695]: I0320 10:54:29.820783 4695 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:29 crc kubenswrapper[4695]: W0320 10:54:29.832175 4695 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 20 10:54:29 crc kubenswrapper[4695]: E0320 10:54:29.832257 4695 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list 
resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 10:54:30 crc kubenswrapper[4695]: I0320 10:54:30.818465 4695 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:31 crc kubenswrapper[4695]: I0320 10:54:31.819280 4695 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:32 crc kubenswrapper[4695]: E0320 10:54:32.036671 4695 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 10:54:32 crc kubenswrapper[4695]: I0320 10:54:32.039793 4695 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:32 crc kubenswrapper[4695]: I0320 10:54:32.041179 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:32 crc kubenswrapper[4695]: I0320 10:54:32.041209 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:32 crc kubenswrapper[4695]: I0320 10:54:32.041219 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:32 crc kubenswrapper[4695]: I0320 10:54:32.041245 4695 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:54:32 crc kubenswrapper[4695]: E0320 10:54:32.044673 4695 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is 
forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 10:54:32 crc kubenswrapper[4695]: I0320 10:54:32.132698 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:54:32 crc kubenswrapper[4695]: I0320 10:54:32.132879 4695 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:32 crc kubenswrapper[4695]: I0320 10:54:32.134048 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:32 crc kubenswrapper[4695]: I0320 10:54:32.134160 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:32 crc kubenswrapper[4695]: I0320 10:54:32.134233 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:32 crc kubenswrapper[4695]: I0320 10:54:32.137540 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:54:32 crc kubenswrapper[4695]: I0320 10:54:32.819527 4695 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:33 crc kubenswrapper[4695]: E0320 10:54:33.024835 4695 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:54:33 crc kubenswrapper[4695]: I0320 10:54:33.099295 4695 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:33 crc kubenswrapper[4695]: I0320 10:54:33.100452 4695 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:33 crc kubenswrapper[4695]: I0320 10:54:33.100685 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:33 crc kubenswrapper[4695]: I0320 10:54:33.100827 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:33 crc kubenswrapper[4695]: I0320 10:54:33.820204 4695 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:34 crc kubenswrapper[4695]: I0320 10:54:34.729148 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:54:34 crc kubenswrapper[4695]: I0320 10:54:34.729651 4695 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:34 crc kubenswrapper[4695]: I0320 10:54:34.731423 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:34 crc kubenswrapper[4695]: I0320 10:54:34.731471 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:34 crc kubenswrapper[4695]: I0320 10:54:34.731483 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:34 crc kubenswrapper[4695]: I0320 10:54:34.819117 4695 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:35 crc kubenswrapper[4695]: I0320 10:54:35.819182 4695 csi_plugin.go:884] Failed to contact API server when 
waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:36 crc kubenswrapper[4695]: I0320 10:54:36.819203 4695 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:37 crc kubenswrapper[4695]: I0320 10:54:37.822750 4695 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:38 crc kubenswrapper[4695]: I0320 10:54:38.820208 4695 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:39 crc kubenswrapper[4695]: E0320 10:54:39.042449 4695 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 10:54:39 crc kubenswrapper[4695]: I0320 10:54:39.045669 4695 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:39 crc kubenswrapper[4695]: I0320 10:54:39.046962 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:39 crc kubenswrapper[4695]: I0320 10:54:39.047014 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:39 crc kubenswrapper[4695]: I0320 10:54:39.047031 4695 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:39 crc kubenswrapper[4695]: I0320 10:54:39.047058 4695 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:54:39 crc kubenswrapper[4695]: E0320 10:54:39.051687 4695 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 10:54:39 crc kubenswrapper[4695]: I0320 10:54:39.818477 4695 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:40 crc kubenswrapper[4695]: I0320 10:54:40.818565 4695 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:41 crc kubenswrapper[4695]: I0320 10:54:41.820692 4695 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:41 crc kubenswrapper[4695]: I0320 10:54:41.887163 4695 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:41 crc kubenswrapper[4695]: I0320 10:54:41.888759 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:41 crc kubenswrapper[4695]: I0320 10:54:41.888819 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:41 crc kubenswrapper[4695]: I0320 10:54:41.888834 4695 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:41 crc kubenswrapper[4695]: I0320 10:54:41.889629 4695 scope.go:117] "RemoveContainer" containerID="cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc" Mar 20 10:54:41 crc kubenswrapper[4695]: E0320 10:54:41.889836 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:54:42 crc kubenswrapper[4695]: I0320 10:54:42.819799 4695 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:43 crc kubenswrapper[4695]: E0320 10:54:43.025112 4695 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:54:43 crc kubenswrapper[4695]: I0320 10:54:43.820386 4695 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:44 crc kubenswrapper[4695]: I0320 10:54:44.820374 4695 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:45 crc kubenswrapper[4695]: I0320 10:54:45.821127 4695 csi_plugin.go:884] Failed to contact API server when waiting for CSINode 
publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:46 crc kubenswrapper[4695]: E0320 10:54:46.049144 4695 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 10:54:46 crc kubenswrapper[4695]: I0320 10:54:46.052273 4695 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:46 crc kubenswrapper[4695]: I0320 10:54:46.054623 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:46 crc kubenswrapper[4695]: I0320 10:54:46.054692 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:46 crc kubenswrapper[4695]: I0320 10:54:46.054707 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:46 crc kubenswrapper[4695]: I0320 10:54:46.054750 4695 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:54:46 crc kubenswrapper[4695]: E0320 10:54:46.061968 4695 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 10:54:46 crc kubenswrapper[4695]: I0320 10:54:46.820812 4695 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:47 crc kubenswrapper[4695]: I0320 10:54:47.820260 4695 csi_plugin.go:884] Failed to contact API 
server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:54:47 crc kubenswrapper[4695]: I0320 10:54:47.826050 4695 csr.go:261] certificate signing request csr-f8r42 is approved, waiting to be issued Mar 20 10:54:47 crc kubenswrapper[4695]: I0320 10:54:47.838406 4695 csr.go:257] certificate signing request csr-f8r42 is issued Mar 20 10:54:48 crc kubenswrapper[4695]: I0320 10:54:48.007560 4695 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 20 10:54:48 crc kubenswrapper[4695]: I0320 10:54:48.667217 4695 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 20 10:54:48 crc kubenswrapper[4695]: I0320 10:54:48.839967 4695 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-18 11:50:42.334396204 +0000 UTC Mar 20 10:54:48 crc kubenswrapper[4695]: I0320 10:54:48.840021 4695 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6552h55m53.494379996s for next certificate rotation Mar 20 10:54:53 crc kubenswrapper[4695]: E0320 10:54:53.026107 4695 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.062718 4695 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.064361 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.064409 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:53 crc kubenswrapper[4695]: 
I0320 10:54:53.064426 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.064558 4695 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.074222 4695 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.074613 4695 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 20 10:54:53 crc kubenswrapper[4695]: E0320 10:54:53.074702 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.079681 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.079731 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.079750 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.079774 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.079795 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:53Z","lastTransitionTime":"2026-03-20T10:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:53 crc kubenswrapper[4695]: E0320 10:54:53.094554 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.102723 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.102802 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.102815 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.102842 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.102856 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:53Z","lastTransitionTime":"2026-03-20T10:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:53 crc kubenswrapper[4695]: E0320 10:54:53.114747 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.125980 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.126022 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.126036 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.126059 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.126086 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:53Z","lastTransitionTime":"2026-03-20T10:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:54:53 crc kubenswrapper[4695]: E0320 10:54:53.140253 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.150148 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.150204 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.150218 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.150240 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.150255 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:53Z","lastTransitionTime":"2026-03-20T10:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:54:53 crc kubenswrapper[4695]: E0320 10:54:53.163178 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 10:54:53 crc kubenswrapper[4695]: E0320 10:54:53.163301 4695 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 20 10:54:53 crc kubenswrapper[4695]: E0320 10:54:53.163332 4695 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.214206 4695 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 20 10:54:53 crc kubenswrapper[4695]: E0320 10:54:53.263950 4695 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:54:53 crc kubenswrapper[4695]: E0320 10:54:53.364266 4695 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:54:53 crc kubenswrapper[4695]: E0320 10:54:53.465137 4695 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:54:53 crc kubenswrapper[4695]: E0320 10:54:53.565994 4695 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:54:53 crc kubenswrapper[4695]: E0320 10:54:53.666867 4695 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:54:53 crc kubenswrapper[4695]: E0320 10:54:53.768081 4695 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.821006 4695 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.842604 4695 apiserver.go:52] "Watching apiserver"
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.848568 4695 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.848989 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"]
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.849474 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.849582 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.849707 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:54:53 crc kubenswrapper[4695]: E0320 10:54:53.849731 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:54:53 crc kubenswrapper[4695]: E0320 10:54:53.849805 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.849925 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.850087 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.850281 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:54:53 crc kubenswrapper[4695]: E0320 10:54:53.850344 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.856685 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.857339 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.857404 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.857792 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.858620 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.858695 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.859249 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.862233 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.862392 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.871216 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:53 crc 
kubenswrapper[4695]: I0320 10:54:53.871257 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.871270 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.871290 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.871303 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:53Z","lastTransitionTime":"2026-03-20T10:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.889119 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.903796 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.917456 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.930329 4695 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.931277 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.944437 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.949631 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.949699 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.949748 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod 
\"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.949875 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.949973 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.950030 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.950087 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.950138 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.950203 4695 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.950253 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.950303 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.950349 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.950388 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.950471 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod 
\"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.950525 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.950576 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.950634 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.950690 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.950738 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.950784 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.950828 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.950145 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.950343 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.950384 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.950554 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.950509 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.950843 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.950875 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951073 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951106 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951134 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951160 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951186 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951209 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951230 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951253 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951273 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951294 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951317 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951338 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951364 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951374 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951386 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951476 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951502 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951495 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951523 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951523 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951622 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951676 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951730 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951782 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951836 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951890 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951983 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.952044 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.952102 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.952163 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.952231 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.952346 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.952403 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.952456 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.952505 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.952566 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.952628 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.952689 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.952753 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.952822 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.952881 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.953187 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.953262 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.953320 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.953430 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.953478 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.953526 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.953586 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.953634 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951522 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951565 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951691 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.953829 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.953893 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.953991 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.954049 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.954110 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.954167 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.954220 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.954268 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.954311 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.954345 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.954385 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.954435 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.954473 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.954508 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.954541 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.954575 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.954610 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.954648 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.954688 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.954729 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.954763 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.954797 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.954831 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.955186 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.955252 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.955306 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.955358 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.955421 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.955475 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.955529 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.955587 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.955631 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.955675 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.955731 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.955779 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.955833 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.955892 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.955977 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.956036 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.956098 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.956162 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.956209 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.956260 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.956306 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.956360 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.956410 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.956459 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.956511 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.956567 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.956644 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.956699 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.956747 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.956800 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.956852 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.956902 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.957000 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.957063 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.957099 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.957136 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.957173 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.957213 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.957247 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.957282 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.957317 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.957352 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.957424 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.957461 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.957504 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.957557 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.957607 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.957658 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.957709 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.957753 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.957805 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.957854 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.957903 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.958858 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.958966 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.959018 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.959085 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.959150 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 
10:54:53.951712 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951818 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.951868 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.952063 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.952145 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.952235 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.952474 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.952501 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.952638 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.952808 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.959749 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.952819 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.952846 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.952930 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.953125 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.953136 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.953160 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.953256 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.953319 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.953542 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.953528 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.953759 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.953928 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.954129 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.954148 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.954176 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.954865 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.954941 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.955166 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.955245 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.955356 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.955417 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.955718 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.955818 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.955894 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.955966 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.956013 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.956087 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.956168 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.956589 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.956584 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.956942 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.957113 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.957210 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.957565 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.958284 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.958530 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.958622 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.960021 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.960288 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.960494 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.960528 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.960740 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.960868 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.960924 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.961189 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.961507 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.961617 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.962308 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.962301 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.962360 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.962544 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.962599 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.962810 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.962861 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.962860 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.962977 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.963527 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.963655 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.963936 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.964239 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.964654 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.964806 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.965109 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.965285 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.966167 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.966313 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.959402 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.966440 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.966519 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.966580 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 10:54:53 
crc kubenswrapper[4695]: I0320 10:54:53.966640 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.966695 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.966753 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.966814 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.966870 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.966966 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.967020 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.967078 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.967133 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.967188 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.967241 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 10:54:53 crc kubenswrapper[4695]: 
I0320 10:54:53.967291 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.967343 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.967394 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.967608 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.967679 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.967731 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.967783 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.967837 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.967892 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.968318 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.968437 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.968895 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.968971 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.969237 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.969418 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.969424 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.969852 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.970202 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.970636 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.971015 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.971322 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.972120 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.972609 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.972645 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.972653 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.972968 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.973237 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.973417 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.974107 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.974332 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.974576 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.974656 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.974714 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.974777 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.974836 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.975039 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.975092 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.975116 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.975143 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.975174 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.975204 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 10:54:53 crc 
kubenswrapper[4695]: I0320 10:54:53.975226 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.975251 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.975234 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.975272 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.975478 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.975510 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.975535 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.975559 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.975587 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.975613 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.975642 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: 
\"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.975707 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.975736 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.975762 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.975796 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.975828 4695 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.975853 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.975881 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.975922 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.975952 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.975977 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976003 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976026 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976050 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976103 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: 
\"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976210 4695 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976229 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976243 4695 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976257 4695 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976272 4695 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976285 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976298 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" 
DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976312 4695 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976325 4695 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976337 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976348 4695 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976359 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976370 4695 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976381 4695 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976392 4695 reconciler_common.go:293] 
"Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976404 4695 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976417 4695 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976431 4695 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976443 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976456 4695 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976469 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976481 4695 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976494 4695 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976510 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976521 4695 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976534 4695 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976548 4695 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976559 4695 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976569 4695 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" 
Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976580 4695 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976590 4695 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976602 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976613 4695 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976624 4695 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976635 4695 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976648 4695 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976661 4695 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976678 4695 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976689 4695 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976701 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976715 4695 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976728 4695 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976740 4695 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976757 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 20 
10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976774 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976788 4695 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976800 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976813 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976824 4695 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976837 4695 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976849 4695 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 
10:54:53.976859 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976870 4695 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976880 4695 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976893 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976917 4695 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976929 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976944 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976959 4695 reconciler_common.go:293] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976971 4695 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976982 4695 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976996 4695 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977006 4695 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977015 4695 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977026 4695 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977036 4695 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977073 4695 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977087 4695 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977099 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977113 4695 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977127 4695 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977148 4695 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977164 4695 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977180 4695 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977193 4695 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977203 4695 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977217 4695 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977228 4695 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977239 4695 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977252 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath 
\"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977265 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977277 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977288 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977298 4695 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977309 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977322 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977333 4695 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc 
kubenswrapper[4695]: I0320 10:54:53.977345 4695 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977357 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977394 4695 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977405 4695 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977417 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977429 4695 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977439 4695 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977452 4695 reconciler_common.go:293] 
"Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977463 4695 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977472 4695 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977485 4695 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977495 4695 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977508 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977520 4695 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977531 4695 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977545 4695 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977555 4695 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977565 4695 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.978192 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.981149 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.982095 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.982149 4695 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.982160 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.982202 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.982217 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:53Z","lastTransitionTime":"2026-03-20T10:54:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.982751 4695 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.983692 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.974655 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.975688 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976109 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976633 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976643 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976739 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976767 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.976785 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977303 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977431 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.977472 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.978113 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.978231 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.978268 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: E0320 10:54:53.978298 4695 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.978440 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.978515 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.978882 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.978983 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.979464 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.979468 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.979733 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.979176 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.979938 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.979944 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.980416 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.980859 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.981578 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.982098 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.982661 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.982742 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.982985 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.983049 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.983073 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.983186 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.983550 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: E0320 10:54:53.983687 4695 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.996767 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.997034 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.997240 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.997508 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.999047 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.999113 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.999380 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.999580 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:53 crc kubenswrapper[4695]: I0320 10:54:53.999805 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.000576 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.001022 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.001314 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:53.984003 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:53.984292 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:53.984463 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:53.985173 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:53.985895 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:53.985944 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:53.986378 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:53.986395 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:53.986408 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:53.986480 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:53.986730 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:54.002287 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:54.002337 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.002363 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:54.002428 4695 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:53.986808 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:53.988118 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:53.987880 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:53.988333 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:53.988660 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:53.989557 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:53.989680 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:54:54.489630823 +0000 UTC m=+72.270236576 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:54.002676 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:54:54.502643179 +0000 UTC m=+72.283248752 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:54.002955 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:54:54.502903676 +0000 UTC m=+72.283509239 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:54.003101 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:54:54.503089221 +0000 UTC m=+72.283694784 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.003248 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.003384 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.003579 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.004939 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.005045 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.005208 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.007117 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.007946 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: 
\"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:54.009241 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:54.009268 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:54.009282 4695 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:54.009343 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:54:54.509323396 +0000 UTC m=+72.289928959 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.010249 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.012771 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.012801 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.012867 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.013651 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.013166 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.014374 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.014443 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.014834 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.014805 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.015094 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.015980 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.015730 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.016019 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.017889 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.022664 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.026610 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.030306 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.041409 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.078614 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.078660 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.078705 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.078753 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.078778 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.078800 4695 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.078812 4695 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.078823 4695 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.078815 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.078836 4695 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.078961 4695 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node 
\"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.078975 4695 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.078989 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079003 4695 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079014 4695 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079025 4695 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079036 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079047 4695 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079058 4695 
reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079069 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079080 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079092 4695 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079104 4695 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079117 4695 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079130 4695 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079143 4695 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079157 4695 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079170 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079182 4695 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079194 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079208 4695 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079219 4695 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079232 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079244 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079257 4695 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079270 4695 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079281 4695 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079292 4695 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079303 4695 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079316 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: 
\"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079328 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079343 4695 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079355 4695 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079368 4695 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079379 4695 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079390 4695 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079402 4695 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079417 4695 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079427 4695 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079439 4695 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079451 4695 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079464 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079476 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079489 4695 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") 
on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079503 4695 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079515 4695 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079527 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079537 4695 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079547 4695 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079559 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079571 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc 
kubenswrapper[4695]: I0320 10:54:54.079585 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079596 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079608 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079620 4695 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079633 4695 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079645 4695 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079656 4695 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079712 4695 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079723 4695 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079733 4695 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079745 4695 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079757 4695 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079769 4695 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079781 4695 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079794 4695 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079807 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079822 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079833 4695 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079846 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079864 4695 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079878 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079889 4695 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079900 4695 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079924 4695 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079935 4695 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079945 4695 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079958 4695 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079968 4695 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.079978 4695 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc 
kubenswrapper[4695]: I0320 10:54:54.079991 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.080002 4695 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.080014 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.085929 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.085970 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.085981 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.086007 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.086020 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:54Z","lastTransitionTime":"2026-03-20T10:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.170865 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.176937 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.184060 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.188758 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.188793 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.188805 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.188821 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.188831 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:54Z","lastTransitionTime":"2026-03-20T10:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:54 crc kubenswrapper[4695]: W0320 10:54:54.191473 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-7c250af9fee8d6ec033a932ceb638136fc62b320d41400d454984930f8504f0c WatchSource:0}: Error finding container 7c250af9fee8d6ec033a932ceb638136fc62b320d41400d454984930f8504f0c: Status 404 returned error can't find the container with id 7c250af9fee8d6ec033a932ceb638136fc62b320d41400d454984930f8504f0c Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:54.192986 4695 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 10:54:54 crc kubenswrapper[4695]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 10:54:54 crc kubenswrapper[4695]: if [[ -f "/env/_master" ]]; then Mar 20 10:54:54 crc kubenswrapper[4695]: set -o allexport Mar 20 10:54:54 crc kubenswrapper[4695]: source "/env/_master" Mar 20 10:54:54 crc kubenswrapper[4695]: set +o allexport Mar 20 10:54:54 crc kubenswrapper[4695]: fi Mar 20 10:54:54 crc kubenswrapper[4695]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 20 10:54:54 crc kubenswrapper[4695]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 10:54:54 crc kubenswrapper[4695]: ho_enable="--enable-hybrid-overlay" Mar 20 10:54:54 crc kubenswrapper[4695]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 10:54:54 crc kubenswrapper[4695]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 10:54:54 crc kubenswrapper[4695]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 10:54:54 crc kubenswrapper[4695]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 10:54:54 crc kubenswrapper[4695]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 10:54:54 crc kubenswrapper[4695]: --webhook-host=127.0.0.1 \ Mar 20 10:54:54 crc kubenswrapper[4695]: --webhook-port=9743 \ Mar 20 10:54:54 crc kubenswrapper[4695]: ${ho_enable} \ Mar 20 10:54:54 crc kubenswrapper[4695]: --enable-interconnect \ Mar 20 10:54:54 crc kubenswrapper[4695]: --disable-approver \ Mar 20 10:54:54 crc kubenswrapper[4695]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 10:54:54 crc kubenswrapper[4695]: --wait-for-kubernetes-api=200s \ Mar 20 10:54:54 crc kubenswrapper[4695]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 10:54:54 crc kubenswrapper[4695]: --loglevel="${LOGLEVEL}" Mar 20 10:54:54 crc kubenswrapper[4695]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 10:54:54 crc kubenswrapper[4695]: > logger="UnhandledError" Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:54.193623 4695 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 10:54:54 crc kubenswrapper[4695]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 
10:54:54 crc kubenswrapper[4695]: set -o allexport
Mar 20 10:54:54 crc kubenswrapper[4695]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then
Mar 20 10:54:54 crc kubenswrapper[4695]: source /etc/kubernetes/apiserver-url.env
Mar 20 10:54:54 crc kubenswrapper[4695]: else
Mar 20 10:54:54 crc kubenswrapper[4695]: echo "Error: /etc/kubernetes/apiserver-url.env is missing"
Mar 20 10:54:54 crc kubenswrapper[4695]: exit 1
Mar 20 10:54:54 crc kubenswrapper[4695]: fi
Mar 20 10:54:54 crc kubenswrapper[4695]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104
Mar 20 10:54:54 crc kubenswrapper[4695]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 10:54:54 crc kubenswrapper[4695]: > logger="UnhandledError"
Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:54.195087 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312"
Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:54.195165 4695 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 10:54:54 crc kubenswrapper[4695]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe
Mar 20 10:54:54 crc kubenswrapper[4695]: if [[ -f "/env/_master" ]]; then
Mar 20 10:54:54 crc kubenswrapper[4695]: set -o allexport
Mar 20 10:54:54 crc kubenswrapper[4695]: source "/env/_master"
Mar 20 10:54:54 crc kubenswrapper[4695]: set +o allexport
Mar 20 10:54:54 crc kubenswrapper[4695]: fi
Mar 20 10:54:54 crc kubenswrapper[4695]:
Mar 20 10:54:54 crc kubenswrapper[4695]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver"
Mar 20 10:54:54 crc kubenswrapper[4695]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \
Mar 20 10:54:54 crc kubenswrapper[4695]: --disable-webhook \
Mar 20 10:54:54 crc kubenswrapper[4695]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \
Mar 20 10:54:54 crc kubenswrapper[4695]: --loglevel="${LOGLEVEL}"
Mar 20 10:54:54 crc kubenswrapper[4695]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 10:54:54 crc kubenswrapper[4695]: > logger="UnhandledError"
Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:54.196323 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d"
Mar 20 10:54:54 crc kubenswrapper[4695]: W0320 10:54:54.200876 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-6aae9b6bdb50c839bfcdee61ffaabb17299c8a9db31723df84c53b2e69dc8d83 WatchSource:0}: Error finding container 6aae9b6bdb50c839bfcdee61ffaabb17299c8a9db31723df84c53b2e69dc8d83: Status 404 returned error can't find the container with id 6aae9b6bdb50c839bfcdee61ffaabb17299c8a9db31723df84c53b2e69dc8d83
Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:54.203371 4695 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError"
Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:54.204671 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.291846 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.292170 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.292308 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.292409 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.292499 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:54Z","lastTransitionTime":"2026-03-20T10:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.395407 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.395449 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.395460 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.395475 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.395486 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:54Z","lastTransitionTime":"2026-03-20T10:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.498819 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.498861 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.498877 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.498900 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.498937 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:54Z","lastTransitionTime":"2026-03-20T10:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.584376 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:54.584618 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:54:55.584588313 +0000 UTC m=+73.365193876 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.585046 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.585196 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.585347 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.585432 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:54.585242 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:54.585647 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:54.585739 4695 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:54.585861 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:54:55.585839457 +0000 UTC m=+73.366445020 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:54.585289 4695 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:54.585442 4695 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:54.586159 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:54:55.586117154 +0000 UTC m=+73.366722757 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:54.585494 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:54.586299 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:54.586315 4695 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:54.586325 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:54:55.586260968 +0000 UTC m=+73.366866681 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 10:54:54 crc kubenswrapper[4695]: E0320 10:54:54.586355 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:54:55.58634414 +0000 UTC m=+73.366949703 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.603092 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.603154 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.603170 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.603192 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.603209 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:54Z","lastTransitionTime":"2026-03-20T10:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.705844 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.705900 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.705935 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.705958 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.705973 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:54Z","lastTransitionTime":"2026-03-20T10:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.808557 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.808603 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.808616 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.808634 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.808646 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:54Z","lastTransitionTime":"2026-03-20T10:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.894458 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.895166 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.896886 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.898212 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.899285 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.900078 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.901670 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.902445 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.903863 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.904679 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.905201 4695 scope.go:117] "RemoveContainer" containerID="cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.906015 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.907347 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.908601 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.910410 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.910444 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.910454 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.910474 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.910484 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:54Z","lastTransitionTime":"2026-03-20T10:54:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.910563 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.911191 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.912578 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.913467 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.914540 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.915188 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.915828 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.916824 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.917994 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.918757 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.919930 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.921090 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.921977 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Mar 20 10:54:54 crc kubenswrapper[4695]: I0320
10:54:54.925888 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.926480 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.927980 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.928865 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.930717 4695 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.930861 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.932766 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.934134 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" 
path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.934544 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.936245 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.937772 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.938501 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.939871 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.940692 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.941782 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.942541 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" 
path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.943973 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.944701 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.945812 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.946631 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.947366 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.948725 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.949223 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.950040 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" 
path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.950491 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.951007 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.952005 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.952453 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 20 10:54:54 crc kubenswrapper[4695]: I0320 10:54:54.953282 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.023822 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.023884 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.023928 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.023951 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.023964 4695 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:55Z","lastTransitionTime":"2026-03-20T10:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.126660 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.127147 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.127157 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.127179 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.127194 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:55Z","lastTransitionTime":"2026-03-20T10:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.158850 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"6aae9b6bdb50c839bfcdee61ffaabb17299c8a9db31723df84c53b2e69dc8d83"} Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.162495 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3a8e501c50a11b4c3fa754f855b2c9cd8485e782c02342fc354a1504e76e3c9a"} Mar 20 10:54:55 crc kubenswrapper[4695]: E0320 10:54:55.162970 4695 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 10:54:55 crc kubenswrapper[4695]: E0320 10:54:55.164080 4695 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 10:54:55 crc kubenswrapper[4695]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 10:54:55 crc kubenswrapper[4695]: if [[ -f "/env/_master" ]]; then Mar 20 10:54:55 crc kubenswrapper[4695]: set -o allexport Mar 20 10:54:55 crc kubenswrapper[4695]: source "/env/_master" Mar 20 10:54:55 crc kubenswrapper[4695]: set +o allexport Mar 20 10:54:55 crc 
kubenswrapper[4695]: fi Mar 20 10:54:55 crc kubenswrapper[4695]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 20 10:54:55 crc kubenswrapper[4695]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 10:54:55 crc kubenswrapper[4695]: ho_enable="--enable-hybrid-overlay" Mar 20 10:54:55 crc kubenswrapper[4695]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 10:54:55 crc kubenswrapper[4695]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 10:54:55 crc kubenswrapper[4695]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 10:54:55 crc kubenswrapper[4695]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 10:54:55 crc kubenswrapper[4695]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 10:54:55 crc kubenswrapper[4695]: --webhook-host=127.0.0.1 \ Mar 20 10:54:55 crc kubenswrapper[4695]: --webhook-port=9743 \ Mar 20 10:54:55 crc kubenswrapper[4695]: ${ho_enable} \ Mar 20 10:54:55 crc kubenswrapper[4695]: --enable-interconnect \ Mar 20 10:54:55 crc kubenswrapper[4695]: --disable-approver \ Mar 20 10:54:55 crc kubenswrapper[4695]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 10:54:55 crc kubenswrapper[4695]: --wait-for-kubernetes-api=200s \ Mar 20 10:54:55 crc kubenswrapper[4695]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 10:54:55 crc kubenswrapper[4695]: --loglevel="${LOGLEVEL}" Mar 20 10:54:55 crc kubenswrapper[4695]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 10:54:55 crc 
kubenswrapper[4695]: > logger="UnhandledError" Mar 20 10:54:55 crc kubenswrapper[4695]: E0320 10:54:55.164122 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 20 10:54:55 crc kubenswrapper[4695]: E0320 10:54:55.166445 4695 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 10:54:55 crc kubenswrapper[4695]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 10:54:55 crc kubenswrapper[4695]: if [[ -f "/env/_master" ]]; then Mar 20 10:54:55 crc kubenswrapper[4695]: set -o allexport Mar 20 10:54:55 crc kubenswrapper[4695]: source "/env/_master" Mar 20 10:54:55 crc kubenswrapper[4695]: set +o allexport Mar 20 10:54:55 crc kubenswrapper[4695]: fi Mar 20 10:54:55 crc kubenswrapper[4695]: Mar 20 10:54:55 crc kubenswrapper[4695]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 20 10:54:55 crc kubenswrapper[4695]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 10:54:55 crc kubenswrapper[4695]: --disable-webhook \ Mar 20 10:54:55 crc kubenswrapper[4695]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 20 10:54:55 crc kubenswrapper[4695]: --loglevel="${LOGLEVEL}" Mar 20 10:54:55 crc kubenswrapper[4695]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 10:54:55 crc kubenswrapper[4695]: > logger="UnhandledError" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.167135 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"7c250af9fee8d6ec033a932ceb638136fc62b320d41400d454984930f8504f0c"} Mar 20 10:54:55 crc kubenswrapper[4695]: E0320 10:54:55.168193 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with 
CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 20 10:54:55 crc kubenswrapper[4695]: E0320 10:54:55.168852 4695 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 10:54:55 crc kubenswrapper[4695]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 10:54:55 crc kubenswrapper[4695]: set -o allexport Mar 20 10:54:55 crc kubenswrapper[4695]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 10:54:55 crc kubenswrapper[4695]: source /etc/kubernetes/apiserver-url.env Mar 20 10:54:55 crc kubenswrapper[4695]: else Mar 20 10:54:55 crc kubenswrapper[4695]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 10:54:55 crc kubenswrapper[4695]: exit 1 Mar 20 10:54:55 crc kubenswrapper[4695]: fi Mar 20 10:54:55 crc kubenswrapper[4695]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 10:54:55 crc kubenswrapper[4695]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 10:54:55 crc kubenswrapper[4695]: > logger="UnhandledError" Mar 20 10:54:55 crc kubenswrapper[4695]: E0320 10:54:55.170087 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.172653 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.184448 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.194265 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.206816 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.223626 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.229447 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.229477 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.229489 4695 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.229508 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.229521 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:55Z","lastTransitionTime":"2026-03-20T10:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.237708 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.251849 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\",\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.267220 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\",\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49
594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.279585 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.290302 4695 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.304707 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.316709 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.326310 4695 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.332761 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.332812 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.332827 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.332850 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.332866 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:55Z","lastTransitionTime":"2026-03-20T10:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.338174 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.436654 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.436750 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.436763 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.436785 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.436798 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:55Z","lastTransitionTime":"2026-03-20T10:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.539699 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.539761 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.539776 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.539796 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.539820 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:55Z","lastTransitionTime":"2026-03-20T10:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.596942 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.597072 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.597103 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.597135 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.597173 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:54:55 crc kubenswrapper[4695]: E0320 10:54:55.597209 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:54:57.597169079 +0000 UTC m=+75.377774642 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:54:55 crc kubenswrapper[4695]: E0320 10:54:55.597305 4695 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:54:55 crc kubenswrapper[4695]: E0320 10:54:55.597340 4695 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:54:55 crc kubenswrapper[4695]: E0320 10:54:55.597374 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:54:57.597354314 +0000 UTC m=+75.377959877 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:54:55 crc kubenswrapper[4695]: E0320 10:54:55.597397 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:54:57.597389585 +0000 UTC m=+75.377995148 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:54:55 crc kubenswrapper[4695]: E0320 10:54:55.597524 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:54:55 crc kubenswrapper[4695]: E0320 10:54:55.597547 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:54:55 crc kubenswrapper[4695]: E0320 10:54:55.597564 4695 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:54:55 crc kubenswrapper[4695]: E0320 10:54:55.597602 4695 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:54:57.59759309 +0000 UTC m=+75.378198873 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:54:55 crc kubenswrapper[4695]: E0320 10:54:55.597681 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:54:55 crc kubenswrapper[4695]: E0320 10:54:55.597693 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:54:55 crc kubenswrapper[4695]: E0320 10:54:55.597701 4695 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:54:55 crc kubenswrapper[4695]: E0320 10:54:55.597735 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:54:57.597723184 +0000 UTC m=+75.378328957 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.642646 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.642691 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.642702 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.642722 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.642738 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:55Z","lastTransitionTime":"2026-03-20T10:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.745830 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.745895 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.745938 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.745963 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.745979 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:55Z","lastTransitionTime":"2026-03-20T10:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.849096 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.849150 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.849164 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.849187 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.849200 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:55Z","lastTransitionTime":"2026-03-20T10:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.886926 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.886979 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.887047 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:54:55 crc kubenswrapper[4695]: E0320 10:54:55.887150 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:54:55 crc kubenswrapper[4695]: E0320 10:54:55.887209 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:54:55 crc kubenswrapper[4695]: E0320 10:54:55.887286 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.955217 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.955269 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.955283 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.955300 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:55 crc kubenswrapper[4695]: I0320 10:54:55.955311 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:55Z","lastTransitionTime":"2026-03-20T10:54:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.057691 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.057744 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.057754 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.057776 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.057790 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:56Z","lastTransitionTime":"2026-03-20T10:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.161206 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.161260 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.161270 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.161289 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.161298 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:56Z","lastTransitionTime":"2026-03-20T10:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.172597 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.174615 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb"} Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.175041 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.187414 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.199865 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.211690 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.226789 4695 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.242858 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.255646 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.264493 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.264538 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.264549 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.264567 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.264581 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:56Z","lastTransitionTime":"2026-03-20T10:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.267709 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\",\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.368234 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.368313 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.368324 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.368344 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.368357 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:56Z","lastTransitionTime":"2026-03-20T10:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.471247 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.471303 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.471319 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.471343 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.471361 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:56Z","lastTransitionTime":"2026-03-20T10:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.574244 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.574292 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.574312 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.574332 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.574345 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:56Z","lastTransitionTime":"2026-03-20T10:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.676964 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.677004 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.677015 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.677040 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.677052 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:56Z","lastTransitionTime":"2026-03-20T10:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.779823 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.779874 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.779890 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.780195 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.780226 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:56Z","lastTransitionTime":"2026-03-20T10:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.883254 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.883315 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.883336 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.883364 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.883382 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:56Z","lastTransitionTime":"2026-03-20T10:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.987102 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.987163 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.987173 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.987195 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:56 crc kubenswrapper[4695]: I0320 10:54:56.987207 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:56Z","lastTransitionTime":"2026-03-20T10:54:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.090293 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.090345 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.090355 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.090373 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.090383 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:57Z","lastTransitionTime":"2026-03-20T10:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.192752 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.192800 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.192811 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.192830 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.192845 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:57Z","lastTransitionTime":"2026-03-20T10:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.295747 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.295830 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.295852 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.295873 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.295887 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:57Z","lastTransitionTime":"2026-03-20T10:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.399275 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.399317 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.399326 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.399418 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.399431 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:57Z","lastTransitionTime":"2026-03-20T10:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.503485 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.503559 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.503577 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.503604 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.503624 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:57Z","lastTransitionTime":"2026-03-20T10:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.606574 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.606620 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.606631 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.606648 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.606661 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:57Z","lastTransitionTime":"2026-03-20T10:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.616188 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.616277 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.616301 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.616323 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.616349 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:54:57 crc kubenswrapper[4695]: E0320 10:54:57.616460 4695 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:54:57 crc kubenswrapper[4695]: E0320 10:54:57.616525 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:55:01.616483344 +0000 UTC m=+79.397088917 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:54:57 crc kubenswrapper[4695]: E0320 10:54:57.616523 4695 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:54:57 crc kubenswrapper[4695]: E0320 10:54:57.616585 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:55:01.616571836 +0000 UTC m=+79.397177609 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:54:57 crc kubenswrapper[4695]: E0320 10:54:57.616587 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:54:57 crc kubenswrapper[4695]: E0320 10:54:57.616577 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:54:57 crc kubenswrapper[4695]: E0320 10:54:57.616628 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:54:57 crc kubenswrapper[4695]: E0320 10:54:57.616648 4695 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:54:57 crc kubenswrapper[4695]: E0320 10:54:57.616652 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:54:57 crc kubenswrapper[4695]: E0320 10:54:57.616665 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-20 10:55:01.616635338 +0000 UTC m=+79.397241051 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:54:57 crc kubenswrapper[4695]: E0320 10:54:57.616675 4695 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:54:57 crc kubenswrapper[4695]: E0320 10:54:57.616704 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:55:01.616688439 +0000 UTC m=+79.397294112 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:54:57 crc kubenswrapper[4695]: E0320 10:54:57.616732 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:55:01.61672161 +0000 UTC m=+79.397327353 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.709740 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.709821 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.709837 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.709857 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.709868 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:57Z","lastTransitionTime":"2026-03-20T10:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.812942 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.812996 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.813053 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.813069 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.813081 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:57Z","lastTransitionTime":"2026-03-20T10:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.886525 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.886685 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.886698 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:54:57 crc kubenswrapper[4695]: E0320 10:54:57.886993 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:54:57 crc kubenswrapper[4695]: E0320 10:54:57.887074 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:54:57 crc kubenswrapper[4695]: E0320 10:54:57.887189 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.915281 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.915312 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.915321 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.915336 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:57 crc kubenswrapper[4695]: I0320 10:54:57.915348 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:57Z","lastTransitionTime":"2026-03-20T10:54:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.018583 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.018644 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.018675 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.018698 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.018711 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:58Z","lastTransitionTime":"2026-03-20T10:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.121308 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.121360 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.121372 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.121394 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.121411 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:58Z","lastTransitionTime":"2026-03-20T10:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.224226 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.224276 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.224287 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.224307 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.224321 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:58Z","lastTransitionTime":"2026-03-20T10:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.326751 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.326816 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.326835 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.326853 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.326866 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:58Z","lastTransitionTime":"2026-03-20T10:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.430217 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.430333 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.430349 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.430370 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.430381 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:58Z","lastTransitionTime":"2026-03-20T10:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.533631 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.533701 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.533714 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.533744 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.533762 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:58Z","lastTransitionTime":"2026-03-20T10:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.636707 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.636771 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.636782 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.636846 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.636859 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:58Z","lastTransitionTime":"2026-03-20T10:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.740791 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.740859 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.740876 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.740900 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.740974 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:58Z","lastTransitionTime":"2026-03-20T10:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.843651 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.843707 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.843723 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.843746 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.843762 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:58Z","lastTransitionTime":"2026-03-20T10:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.945870 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.945923 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.945933 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.945948 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:58 crc kubenswrapper[4695]: I0320 10:54:58.945959 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:58Z","lastTransitionTime":"2026-03-20T10:54:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.049414 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.049469 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.049479 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.049497 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.049510 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:59Z","lastTransitionTime":"2026-03-20T10:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.152495 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.152538 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.152548 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.152572 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.152584 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:59Z","lastTransitionTime":"2026-03-20T10:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.255038 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.255091 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.255101 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.255119 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.255133 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:59Z","lastTransitionTime":"2026-03-20T10:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.358091 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.358136 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.358148 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.358167 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.358179 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:59Z","lastTransitionTime":"2026-03-20T10:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.461177 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.461226 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.461236 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.461251 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.461262 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:59Z","lastTransitionTime":"2026-03-20T10:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.564709 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.564763 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.564777 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.564798 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.564811 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:59Z","lastTransitionTime":"2026-03-20T10:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.668599 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.668681 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.668702 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.668727 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.668743 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:59Z","lastTransitionTime":"2026-03-20T10:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.771314 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.771376 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.771387 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.771405 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.771418 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:59Z","lastTransitionTime":"2026-03-20T10:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.873846 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.873891 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.873933 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.873952 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.873965 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:59Z","lastTransitionTime":"2026-03-20T10:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.886776 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.886776 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.886811 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:54:59 crc kubenswrapper[4695]: E0320 10:54:59.887282 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:54:59 crc kubenswrapper[4695]: E0320 10:54:59.886962 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:54:59 crc kubenswrapper[4695]: E0320 10:54:59.887074 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.976761 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.976818 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.976830 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.976848 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:54:59 crc kubenswrapper[4695]: I0320 10:54:59.976862 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:54:59Z","lastTransitionTime":"2026-03-20T10:54:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.080268 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.080337 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.080360 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.080390 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.080463 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:00Z","lastTransitionTime":"2026-03-20T10:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.184026 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.184087 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.184101 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.184119 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.184133 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:00Z","lastTransitionTime":"2026-03-20T10:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.287088 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.287139 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.287150 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.287167 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.287177 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:00Z","lastTransitionTime":"2026-03-20T10:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.391603 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.391675 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.391688 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.391711 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.391724 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:00Z","lastTransitionTime":"2026-03-20T10:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.494900 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.494974 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.494989 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.495009 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.495022 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:00Z","lastTransitionTime":"2026-03-20T10:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.598110 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.598161 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.598174 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.598198 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.598211 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:00Z","lastTransitionTime":"2026-03-20T10:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.701175 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.701242 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.701257 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.701281 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.701297 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:00Z","lastTransitionTime":"2026-03-20T10:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.805071 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.805139 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.805155 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.805179 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.805195 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:00Z","lastTransitionTime":"2026-03-20T10:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.907724 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.907782 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.907798 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.907820 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:00 crc kubenswrapper[4695]: I0320 10:55:00.907843 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:00Z","lastTransitionTime":"2026-03-20T10:55:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.010901 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.010973 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.010988 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.011012 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.011027 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:01Z","lastTransitionTime":"2026-03-20T10:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.113822 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.113870 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.113883 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.113903 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.113930 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:01Z","lastTransitionTime":"2026-03-20T10:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.216293 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.216357 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.216369 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.216389 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.216402 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:01Z","lastTransitionTime":"2026-03-20T10:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.318791 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.319228 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.319372 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.319456 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.319580 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:01Z","lastTransitionTime":"2026-03-20T10:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.422725 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.422796 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.422814 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.422837 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.422854 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:01Z","lastTransitionTime":"2026-03-20T10:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.525788 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.525877 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.525886 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.525902 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.525933 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:01Z","lastTransitionTime":"2026-03-20T10:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.628838 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.629193 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.629440 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.629618 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.629805 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:01Z","lastTransitionTime":"2026-03-20T10:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.661491 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.661632 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.661655 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.661674 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.661693 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:01 crc kubenswrapper[4695]: E0320 10:55:01.661812 4695 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:55:01 crc kubenswrapper[4695]: E0320 10:55:01.661867 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:55:09.661852444 +0000 UTC m=+87.442458007 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:55:01 crc kubenswrapper[4695]: E0320 10:55:01.662185 4695 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:55:01 crc kubenswrapper[4695]: E0320 10:55:01.662227 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:55:01 crc kubenswrapper[4695]: E0320 10:55:01.662257 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:55:01 crc 
kubenswrapper[4695]: E0320 10:55:01.662281 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:55:01 crc kubenswrapper[4695]: E0320 10:55:01.662297 4695 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:55:01 crc kubenswrapper[4695]: E0320 10:55:01.662318 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:55:09.662285916 +0000 UTC m=+87.442891639 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:55:01 crc kubenswrapper[4695]: E0320 10:55:01.662259 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:55:01 crc kubenswrapper[4695]: E0320 10:55:01.662337 4695 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:55:01 crc kubenswrapper[4695]: E0320 10:55:01.662344 4695 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:55:09.662333757 +0000 UTC m=+87.442939540 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:55:01 crc kubenswrapper[4695]: E0320 10:55:01.662362 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:55:09.662352398 +0000 UTC m=+87.442957961 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:55:01 crc kubenswrapper[4695]: E0320 10:55:01.662677 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:55:09.662660126 +0000 UTC m=+87.443265689 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.732487 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.732536 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.732547 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.732568 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.732580 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:01Z","lastTransitionTime":"2026-03-20T10:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.835546 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.835603 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.835622 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.835647 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.835661 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:01Z","lastTransitionTime":"2026-03-20T10:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.886528 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.886609 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:01 crc kubenswrapper[4695]: E0320 10:55:01.886859 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:55:01 crc kubenswrapper[4695]: E0320 10:55:01.887007 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.887114 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:01 crc kubenswrapper[4695]: E0320 10:55:01.887206 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.898714 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.938812 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.938854 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.938866 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.938886 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:01 crc kubenswrapper[4695]: I0320 10:55:01.938900 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:01Z","lastTransitionTime":"2026-03-20T10:55:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.042043 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.042098 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.042111 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.042129 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.042141 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:02Z","lastTransitionTime":"2026-03-20T10:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.145329 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.145364 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.145373 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.145390 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.145402 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:02Z","lastTransitionTime":"2026-03-20T10:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.248122 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.248169 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.248177 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.248193 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.248203 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:02Z","lastTransitionTime":"2026-03-20T10:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.351316 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.351351 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.351362 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.351380 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.351413 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:02Z","lastTransitionTime":"2026-03-20T10:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.367547 4695 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.453766 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.453829 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.453842 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.453862 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.453873 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:02Z","lastTransitionTime":"2026-03-20T10:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.557997 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.558086 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.558108 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.558139 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.558168 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:02Z","lastTransitionTime":"2026-03-20T10:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.661124 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.661192 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.661216 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.661244 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.661285 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:02Z","lastTransitionTime":"2026-03-20T10:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.764254 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.764307 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.764321 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.764344 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.764358 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:02Z","lastTransitionTime":"2026-03-20T10:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.868406 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.868466 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.868477 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.868497 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.868518 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:02Z","lastTransitionTime":"2026-03-20T10:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.904245 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\",\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints 
version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.918766 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.931526 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.942335 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.959704 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.971527 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.971603 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.971618 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.971641 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.971659 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:02Z","lastTransitionTime":"2026-03-20T10:55:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.971891 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.989153 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:02 crc kubenswrapper[4695]: I0320 10:55:02.998547 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.074305 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.074350 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.074364 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.074380 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.074393 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:03Z","lastTransitionTime":"2026-03-20T10:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.177490 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.177556 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.177570 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.177651 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.177670 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:03Z","lastTransitionTime":"2026-03-20T10:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.279722 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.279762 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.279772 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.279787 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.279799 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:03Z","lastTransitionTime":"2026-03-20T10:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.382999 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.383083 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.383102 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.383121 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.383133 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:03Z","lastTransitionTime":"2026-03-20T10:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.485207 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.485246 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.485255 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.485270 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.485279 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:03Z","lastTransitionTime":"2026-03-20T10:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.517829 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.517877 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.517888 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.517922 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.517936 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:03Z","lastTransitionTime":"2026-03-20T10:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:03 crc kubenswrapper[4695]: E0320 10:55:03.529962 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.535262 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.535333 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.535353 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.535380 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.535398 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:03Z","lastTransitionTime":"2026-03-20T10:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:03 crc kubenswrapper[4695]: E0320 10:55:03.550856 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.555035 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.555078 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.555088 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.555106 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.555117 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:03Z","lastTransitionTime":"2026-03-20T10:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:03 crc kubenswrapper[4695]: E0320 10:55:03.569472 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.573854 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.573925 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.573942 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.573962 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.573978 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:03Z","lastTransitionTime":"2026-03-20T10:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:03 crc kubenswrapper[4695]: E0320 10:55:03.585202 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.589377 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.589412 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.589426 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.589445 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.589460 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:03Z","lastTransitionTime":"2026-03-20T10:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:03 crc kubenswrapper[4695]: E0320 10:55:03.599927 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:03 crc kubenswrapper[4695]: E0320 10:55:03.600070 4695 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.601823 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.601887 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.601923 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.601944 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.601961 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:03Z","lastTransitionTime":"2026-03-20T10:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.704890 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.704990 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.705011 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.705050 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.705075 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:03Z","lastTransitionTime":"2026-03-20T10:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.807601 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.807661 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.807676 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.807697 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.807710 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:03Z","lastTransitionTime":"2026-03-20T10:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.886536 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.886677 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:03 crc kubenswrapper[4695]: E0320 10:55:03.886776 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.886835 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:03 crc kubenswrapper[4695]: E0320 10:55:03.886966 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:55:03 crc kubenswrapper[4695]: E0320 10:55:03.887065 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.910680 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.910721 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.910729 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.910743 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:03 crc kubenswrapper[4695]: I0320 10:55:03.910755 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:03Z","lastTransitionTime":"2026-03-20T10:55:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.014371 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.014432 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.014448 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.014467 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.014479 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:04Z","lastTransitionTime":"2026-03-20T10:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.118178 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.118245 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.118265 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.118291 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.118312 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:04Z","lastTransitionTime":"2026-03-20T10:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.221780 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.221848 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.221865 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.221891 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.221924 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:04Z","lastTransitionTime":"2026-03-20T10:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.324513 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.324570 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.324580 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.324599 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.324609 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:04Z","lastTransitionTime":"2026-03-20T10:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.427524 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.427575 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.427587 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.427608 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.427623 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:04Z","lastTransitionTime":"2026-03-20T10:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.529641 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.529685 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.529694 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.529711 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.529722 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:04Z","lastTransitionTime":"2026-03-20T10:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.633071 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.633132 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.633145 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.633214 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.633232 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:04Z","lastTransitionTime":"2026-03-20T10:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.736460 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.736515 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.736526 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.736544 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.736555 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:04Z","lastTransitionTime":"2026-03-20T10:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.802338 4695 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.839468 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.839514 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.839525 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.839544 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.839556 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:04Z","lastTransitionTime":"2026-03-20T10:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.942817 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.942869 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.942882 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.942901 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:04 crc kubenswrapper[4695]: I0320 10:55:04.942932 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:04Z","lastTransitionTime":"2026-03-20T10:55:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.045805 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.045847 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.045857 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.045873 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.045885 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:05Z","lastTransitionTime":"2026-03-20T10:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.148949 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.148997 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.149011 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.149032 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.149046 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:05Z","lastTransitionTime":"2026-03-20T10:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.251339 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.251408 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.251419 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.251439 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.251453 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:05Z","lastTransitionTime":"2026-03-20T10:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.353971 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.354018 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.354031 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.354072 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.354110 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:05Z","lastTransitionTime":"2026-03-20T10:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.456757 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.456826 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.456844 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.456870 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.456882 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:05Z","lastTransitionTime":"2026-03-20T10:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.526297 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.536603 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\
\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.546769 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.556495 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.559680 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.559764 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.559784 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.559850 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.559989 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:05Z","lastTransitionTime":"2026-03-20T10:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.567069 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.577474 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.589798 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\"
,\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.599676 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.608934 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.662825 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.662894 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.662924 4695 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.662949 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.662967 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:05Z","lastTransitionTime":"2026-03-20T10:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.767313 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.767368 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.767378 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.767402 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.767415 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:05Z","lastTransitionTime":"2026-03-20T10:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.870405 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.870455 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.870464 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.870483 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.870497 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:05Z","lastTransitionTime":"2026-03-20T10:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.886871 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.887050 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:05 crc kubenswrapper[4695]: E0320 10:55:05.887190 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.887285 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:05 crc kubenswrapper[4695]: E0320 10:55:05.887727 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:55:05 crc kubenswrapper[4695]: E0320 10:55:05.887929 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.908033 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.973582 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.973624 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.973634 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.973651 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:05 crc kubenswrapper[4695]: I0320 10:55:05.973665 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:05Z","lastTransitionTime":"2026-03-20T10:55:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.076850 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.076891 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.076923 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.076945 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.076962 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:06Z","lastTransitionTime":"2026-03-20T10:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.179758 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.179803 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.179815 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.179834 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.179855 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:06Z","lastTransitionTime":"2026-03-20T10:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.202010 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b"} Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.221440 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":
{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd64354
4df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\",\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.235142 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.245068 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.271803 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.282216 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.282271 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.282282 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.282298 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.282310 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:06Z","lastTransitionTime":"2026-03-20T10:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.286291 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.298614 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.308444 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.318214 4695 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.327082 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.385540 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.385592 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.385603 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.385623 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.385639 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:06Z","lastTransitionTime":"2026-03-20T10:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.488616 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.488670 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.488682 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.488703 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.488716 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:06Z","lastTransitionTime":"2026-03-20T10:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.592233 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.592290 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.592303 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.592361 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.592373 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:06Z","lastTransitionTime":"2026-03-20T10:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.695208 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.695280 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.695300 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.695327 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.695345 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:06Z","lastTransitionTime":"2026-03-20T10:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.798657 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.798738 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.798753 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.798780 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.798798 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:06Z","lastTransitionTime":"2026-03-20T10:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.901934 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.901979 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.901992 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.902016 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:06 crc kubenswrapper[4695]: I0320 10:55:06.902031 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:06Z","lastTransitionTime":"2026-03-20T10:55:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.004886 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.004953 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.004966 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.004986 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.005001 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:07Z","lastTransitionTime":"2026-03-20T10:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.107902 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.107962 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.107974 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.107995 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.108009 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:07Z","lastTransitionTime":"2026-03-20T10:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.206629 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c"} Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.206698 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7"} Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.210694 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.210753 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.210768 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.210788 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.210804 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:07Z","lastTransitionTime":"2026-03-20T10:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.221474 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.234400 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.268322 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.285406 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.305297 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.313379 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.313726 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.313800 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.313869 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.313975 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:07Z","lastTransitionTime":"2026-03-20T10:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.322442 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f
2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\",\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.335517 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.348299 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.372149 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.416378 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.416430 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.416443 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.416461 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.416473 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:07Z","lastTransitionTime":"2026-03-20T10:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.519768 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.519811 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.519821 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.519838 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.519850 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:07Z","lastTransitionTime":"2026-03-20T10:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.622351 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.622726 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.622814 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.622968 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.623051 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:07Z","lastTransitionTime":"2026-03-20T10:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.702798 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-75zwx"] Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.703675 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-75zwx" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.706439 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.706615 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.708581 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.730952 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.731080 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/548a9f17-19d1-4267-a179-75a82fe79a42-hosts-file\") pod \"node-resolver-75zwx\" (UID: \"548a9f17-19d1-4267-a179-75a82fe79a42\") " pod="openshift-dns/node-resolver-75zwx" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.731125 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvhmk\" (UniqueName: \"kubernetes.io/projected/548a9f17-19d1-4267-a179-75a82fe79a42-kube-api-access-gvhmk\") pod \"node-resolver-75zwx\" (UID: \"548a9f17-19d1-4267-a179-75a82fe79a42\") " pod="openshift-dns/node-resolver-75zwx" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.732177 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:07 crc 
kubenswrapper[4695]: I0320 10:55:07.732232 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.732243 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.732263 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.732275 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:07Z","lastTransitionTime":"2026-03-20T10:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.757176 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.771369 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.784715 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.799243 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.817118 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.832290 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/548a9f17-19d1-4267-a179-75a82fe79a42-hosts-file\") pod \"node-resolver-75zwx\" (UID: \"548a9f17-19d1-4267-a179-75a82fe79a42\") " pod="openshift-dns/node-resolver-75zwx" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.832350 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvhmk\" (UniqueName: \"kubernetes.io/projected/548a9f17-19d1-4267-a179-75a82fe79a42-kube-api-access-gvhmk\") pod \"node-resolver-75zwx\" (UID: \"548a9f17-19d1-4267-a179-75a82fe79a42\") " pod="openshift-dns/node-resolver-75zwx" Mar 20 10:55:07 crc 
kubenswrapper[4695]: I0320 10:55:07.832482 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/548a9f17-19d1-4267-a179-75a82fe79a42-hosts-file\") pod \"node-resolver-75zwx\" (UID: \"548a9f17-19d1-4267-a179-75a82fe79a42\") " pod="openshift-dns/node-resolver-75zwx" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.834598 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.834656 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.834672 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.834691 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.834701 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:07Z","lastTransitionTime":"2026-03-20T10:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.835792 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.855337 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvhmk\" (UniqueName: \"kubernetes.io/projected/548a9f17-19d1-4267-a179-75a82fe79a42-kube-api-access-gvhmk\") pod \"node-resolver-75zwx\" (UID: \"548a9f17-19d1-4267-a179-75a82fe79a42\") " pod="openshift-dns/node-resolver-75zwx" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.858931 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.876746 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\",\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.886281 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:07 crc kubenswrapper[4695]: E0320 10:55:07.888348 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.886216 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:07 crc kubenswrapper[4695]: E0320 10:55:07.888554 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.886959 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:07 crc kubenswrapper[4695]: E0320 10:55:07.888635 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.898975 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.938139 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.938193 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.938205 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.938223 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:07 crc kubenswrapper[4695]: I0320 10:55:07.938236 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:07Z","lastTransitionTime":"2026-03-20T10:55:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.025294 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-75zwx" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.041178 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.041223 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.041258 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.041282 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.041297 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:08Z","lastTransitionTime":"2026-03-20T10:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.096305 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-hg7g5"] Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.096840 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.098532 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-6jlvp"] Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.099180 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-bnwz5"] Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.099495 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.100014 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.103302 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.103535 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.103616 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.103698 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.103756 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.103801 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.103839 4695 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.103772 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.104153 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.104294 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.105174 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.106567 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.130731 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.134347 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-system-cni-dir\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.134382 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/52301735-de4f-4672-9e4d-6bd74bccedad-cni-binary-copy\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.134412 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-host-var-lib-kubelet\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.134443 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-multus-cni-dir\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.134471 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-host-run-k8s-cni-cncf-io\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.134494 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/52301735-de4f-4672-9e4d-6bd74bccedad-multus-daemon-config\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.134517 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-host-run-multus-certs\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.134540 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-host-run-netns\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.134563 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-etc-kubernetes\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.134587 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-host-var-lib-cni-bin\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.134611 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-hostroot\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.134732 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-cnibin\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.134776 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-os-release\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.134892 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-multus-socket-dir-parent\") pod 
\"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.135071 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-host-var-lib-cni-multus\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.135111 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-multus-conf-dir\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.135142 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69bmq\" (UniqueName: \"kubernetes.io/projected/52301735-de4f-4672-9e4d-6bd74bccedad-kube-api-access-69bmq\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.148499 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.148544 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.148553 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.148574 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:08 crc 
kubenswrapper[4695]: I0320 10:55:08.148587 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:08Z","lastTransitionTime":"2026-03-20T10:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.149506 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\",\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.167901 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.183048 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.200359 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.210684 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-75zwx" event={"ID":"548a9f17-19d1-4267-a179-75a82fe79a42","Type":"ContainerStarted","Data":"2198561b8391c556c3c31bb898d36730deceea0aa92c5020f00c6b7dfb3c1db3"} Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.215348 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.234216 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.235576 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-host-var-lib-cni-multus\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.235611 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpk42\" (UniqueName: \"kubernetes.io/projected/0500a369-efac-495f-83aa-8b400fd54206-kube-api-access-tpk42\") pod \"multus-additional-cni-plugins-6jlvp\" (UID: \"0500a369-efac-495f-83aa-8b400fd54206\") " pod="openshift-multus/multus-additional-cni-plugins-6jlvp" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.235632 4695 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-host-var-lib-kubelet\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.235651 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-system-cni-dir\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.235670 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/52301735-de4f-4672-9e4d-6bd74bccedad-cni-binary-copy\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.235688 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0500a369-efac-495f-83aa-8b400fd54206-system-cni-dir\") pod \"multus-additional-cni-plugins-6jlvp\" (UID: \"0500a369-efac-495f-83aa-8b400fd54206\") " pod="openshift-multus/multus-additional-cni-plugins-6jlvp" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.235709 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/52301735-de4f-4672-9e4d-6bd74bccedad-multus-daemon-config\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.235728 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-host-run-multus-certs\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.235754 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0500a369-efac-495f-83aa-8b400fd54206-cni-binary-copy\") pod \"multus-additional-cni-plugins-6jlvp\" (UID: \"0500a369-efac-495f-83aa-8b400fd54206\") " pod="openshift-multus/multus-additional-cni-plugins-6jlvp" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.235777 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7859c924-84d7-4855-901e-c77a02c56e3a-proxy-tls\") pod \"machine-config-daemon-bnwz5\" (UID: \"7859c924-84d7-4855-901e-c77a02c56e3a\") " pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.235769 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-host-var-lib-cni-multus\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.235841 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-host-run-netns\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.235796 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-host-run-netns\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.235883 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-system-cni-dir\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.235962 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-host-run-multus-certs\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.235894 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0500a369-efac-495f-83aa-8b400fd54206-os-release\") pod \"multus-additional-cni-plugins-6jlvp\" (UID: \"0500a369-efac-495f-83aa-8b400fd54206\") " pod="openshift-multus/multus-additional-cni-plugins-6jlvp" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.236024 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-host-var-lib-cni-bin\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.236031 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-host-var-lib-kubelet\") pod \"multus-hg7g5\" (UID: 
\"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.236051 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-hostroot\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.236079 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-hostroot\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.236106 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-host-var-lib-cni-bin\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.236123 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-os-release\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.236159 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0500a369-efac-495f-83aa-8b400fd54206-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6jlvp\" (UID: \"0500a369-efac-495f-83aa-8b400fd54206\") " pod="openshift-multus/multus-additional-cni-plugins-6jlvp" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.236180 4695 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw6vr\" (UniqueName: \"kubernetes.io/projected/7859c924-84d7-4855-901e-c77a02c56e3a-kube-api-access-lw6vr\") pod \"machine-config-daemon-bnwz5\" (UID: \"7859c924-84d7-4855-901e-c77a02c56e3a\") " pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.236211 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-multus-socket-dir-parent\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.236229 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7859c924-84d7-4855-901e-c77a02c56e3a-rootfs\") pod \"machine-config-daemon-bnwz5\" (UID: \"7859c924-84d7-4855-901e-c77a02c56e3a\") " pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.236243 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-os-release\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.236252 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-multus-conf-dir\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.236276 4695 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-69bmq\" (UniqueName: \"kubernetes.io/projected/52301735-de4f-4672-9e4d-6bd74bccedad-kube-api-access-69bmq\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.236290 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-multus-conf-dir\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.236301 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0500a369-efac-495f-83aa-8b400fd54206-cnibin\") pod \"multus-additional-cni-plugins-6jlvp\" (UID: \"0500a369-efac-495f-83aa-8b400fd54206\") " pod="openshift-multus/multus-additional-cni-plugins-6jlvp" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.236324 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-multus-cni-dir\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.236343 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-host-run-k8s-cni-cncf-io\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.236367 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/7859c924-84d7-4855-901e-c77a02c56e3a-mcd-auth-proxy-config\") pod \"machine-config-daemon-bnwz5\" (UID: \"7859c924-84d7-4855-901e-c77a02c56e3a\") " pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.236389 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-etc-kubernetes\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.236422 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-cnibin\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.236442 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0500a369-efac-495f-83aa-8b400fd54206-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6jlvp\" (UID: \"0500a369-efac-495f-83aa-8b400fd54206\") " pod="openshift-multus/multus-additional-cni-plugins-6jlvp" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.236443 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-multus-socket-dir-parent\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.236517 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-host-run-k8s-cni-cncf-io\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.236526 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-etc-kubernetes\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.236583 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-cnibin\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.236626 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/52301735-de4f-4672-9e4d-6bd74bccedad-multus-daemon-config\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.236734 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/52301735-de4f-4672-9e4d-6bd74bccedad-multus-cni-dir\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.236778 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/52301735-de4f-4672-9e4d-6bd74bccedad-cni-binary-copy\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc 
kubenswrapper[4695]: I0320 10:55:08.246440 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.250577 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.250642 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.250657 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.250683 4695 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.250697 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:08Z","lastTransitionTime":"2026-03-20T10:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.265848 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.266418 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69bmq\" (UniqueName: \"kubernetes.io/projected/52301735-de4f-4672-9e4d-6bd74bccedad-kube-api-access-69bmq\") pod \"multus-hg7g5\" (UID: \"52301735-de4f-4672-9e4d-6bd74bccedad\") " pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.283997 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.300386 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.312563 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.329706 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.337620 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0500a369-efac-495f-83aa-8b400fd54206-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6jlvp\" (UID: \"0500a369-efac-495f-83aa-8b400fd54206\") " pod="openshift-multus/multus-additional-cni-plugins-6jlvp" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.337676 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpk42\" (UniqueName: \"kubernetes.io/projected/0500a369-efac-495f-83aa-8b400fd54206-kube-api-access-tpk42\") pod \"multus-additional-cni-plugins-6jlvp\" (UID: \"0500a369-efac-495f-83aa-8b400fd54206\") " pod="openshift-multus/multus-additional-cni-plugins-6jlvp" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.338109 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0500a369-efac-495f-83aa-8b400fd54206-system-cni-dir\") pod \"multus-additional-cni-plugins-6jlvp\" (UID: \"0500a369-efac-495f-83aa-8b400fd54206\") " pod="openshift-multus/multus-additional-cni-plugins-6jlvp" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.338137 4695 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0500a369-efac-495f-83aa-8b400fd54206-os-release\") pod \"multus-additional-cni-plugins-6jlvp\" (UID: \"0500a369-efac-495f-83aa-8b400fd54206\") " pod="openshift-multus/multus-additional-cni-plugins-6jlvp" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.338234 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0500a369-efac-495f-83aa-8b400fd54206-cni-binary-copy\") pod \"multus-additional-cni-plugins-6jlvp\" (UID: \"0500a369-efac-495f-83aa-8b400fd54206\") " pod="openshift-multus/multus-additional-cni-plugins-6jlvp" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.338259 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7859c924-84d7-4855-901e-c77a02c56e3a-proxy-tls\") pod \"machine-config-daemon-bnwz5\" (UID: \"7859c924-84d7-4855-901e-c77a02c56e3a\") " pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.338287 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0500a369-efac-495f-83aa-8b400fd54206-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6jlvp\" (UID: \"0500a369-efac-495f-83aa-8b400fd54206\") " pod="openshift-multus/multus-additional-cni-plugins-6jlvp" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.338201 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0500a369-efac-495f-83aa-8b400fd54206-system-cni-dir\") pod \"multus-additional-cni-plugins-6jlvp\" (UID: \"0500a369-efac-495f-83aa-8b400fd54206\") " pod="openshift-multus/multus-additional-cni-plugins-6jlvp" Mar 20 10:55:08 crc kubenswrapper[4695]: 
I0320 10:55:08.338372 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw6vr\" (UniqueName: \"kubernetes.io/projected/7859c924-84d7-4855-901e-c77a02c56e3a-kube-api-access-lw6vr\") pod \"machine-config-daemon-bnwz5\" (UID: \"7859c924-84d7-4855-901e-c77a02c56e3a\") " pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.338437 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7859c924-84d7-4855-901e-c77a02c56e3a-rootfs\") pod \"machine-config-daemon-bnwz5\" (UID: \"7859c924-84d7-4855-901e-c77a02c56e3a\") " pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.338661 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7859c924-84d7-4855-901e-c77a02c56e3a-rootfs\") pod \"machine-config-daemon-bnwz5\" (UID: \"7859c924-84d7-4855-901e-c77a02c56e3a\") " pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.338710 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0500a369-efac-495f-83aa-8b400fd54206-cnibin\") pod \"multus-additional-cni-plugins-6jlvp\" (UID: \"0500a369-efac-495f-83aa-8b400fd54206\") " pod="openshift-multus/multus-additional-cni-plugins-6jlvp" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.338678 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0500a369-efac-495f-83aa-8b400fd54206-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6jlvp\" (UID: \"0500a369-efac-495f-83aa-8b400fd54206\") " pod="openshift-multus/multus-additional-cni-plugins-6jlvp" Mar 20 10:55:08 crc 
kubenswrapper[4695]: I0320 10:55:08.338764 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7859c924-84d7-4855-901e-c77a02c56e3a-mcd-auth-proxy-config\") pod \"machine-config-daemon-bnwz5\" (UID: \"7859c924-84d7-4855-901e-c77a02c56e3a\") " pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.338770 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0500a369-efac-495f-83aa-8b400fd54206-cnibin\") pod \"multus-additional-cni-plugins-6jlvp\" (UID: \"0500a369-efac-495f-83aa-8b400fd54206\") " pod="openshift-multus/multus-additional-cni-plugins-6jlvp" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.338950 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0500a369-efac-495f-83aa-8b400fd54206-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6jlvp\" (UID: \"0500a369-efac-495f-83aa-8b400fd54206\") " pod="openshift-multus/multus-additional-cni-plugins-6jlvp" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.339186 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0500a369-efac-495f-83aa-8b400fd54206-os-release\") pod \"multus-additional-cni-plugins-6jlvp\" (UID: \"0500a369-efac-495f-83aa-8b400fd54206\") " pod="openshift-multus/multus-additional-cni-plugins-6jlvp" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.339397 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0500a369-efac-495f-83aa-8b400fd54206-cni-binary-copy\") pod \"multus-additional-cni-plugins-6jlvp\" (UID: \"0500a369-efac-495f-83aa-8b400fd54206\") " pod="openshift-multus/multus-additional-cni-plugins-6jlvp" Mar 20 
10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.339605 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7859c924-84d7-4855-901e-c77a02c56e3a-mcd-auth-proxy-config\") pod \"machine-config-daemon-bnwz5\" (UID: \"7859c924-84d7-4855-901e-c77a02c56e3a\") " pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.342171 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7859c924-84d7-4855-901e-c77a02c56e3a-proxy-tls\") pod \"machine-config-daemon-bnwz5\" (UID: \"7859c924-84d7-4855-901e-c77a02c56e3a\") " pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.345778 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.354004 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.354043 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.354052 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.354071 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.354083 4695 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:08Z","lastTransitionTime":"2026-03-20T10:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.357258 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpk42\" (UniqueName: \"kubernetes.io/projected/0500a369-efac-495f-83aa-8b400fd54206-kube-api-access-tpk42\") pod \"multus-additional-cni-plugins-6jlvp\" (UID: \"0500a369-efac-495f-83aa-8b400fd54206\") " pod="openshift-multus/multus-additional-cni-plugins-6jlvp" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.357710 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw6vr\" (UniqueName: \"kubernetes.io/projected/7859c924-84d7-4855-901e-c77a02c56e3a-kube-api-access-lw6vr\") pod \"machine-config-daemon-bnwz5\" (UID: \"7859c924-84d7-4855-901e-c77a02c56e3a\") " pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.358961 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.381551 4695 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.397008 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.417832 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6
a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f584
08f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f31
8bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.419373 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hg7g5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.429868 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.434418 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":
\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserv
er-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\",\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.438802 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" Mar 20 10:55:08 crc kubenswrapper[4695]: W0320 10:55:08.446038 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52301735_de4f_4672_9e4d_6bd74bccedad.slice/crio-f70e19a426442a41514fc9348d3a9d2ea9d7ed196b11f5038818e5d44f1b9dd1 WatchSource:0}: Error finding container f70e19a426442a41514fc9348d3a9d2ea9d7ed196b11f5038818e5d44f1b9dd1: Status 404 returned error can't find the container with id f70e19a426442a41514fc9348d3a9d2ea9d7ed196b11f5038818e5d44f1b9dd1 Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.450293 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.457940 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.457995 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.458009 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.458029 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.458042 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:08Z","lastTransitionTime":"2026-03-20T10:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:08 crc kubenswrapper[4695]: W0320 10:55:08.460714 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0500a369_efac_495f_83aa_8b400fd54206.slice/crio-897c29581bbdec6887cddfdfc5f4602c0fa2880ba19aa93724bb57650c71bf39 WatchSource:0}: Error finding container 897c29581bbdec6887cddfdfc5f4602c0fa2880ba19aa93724bb57650c71bf39: Status 404 returned error can't find the container with id 897c29581bbdec6887cddfdfc5f4602c0fa2880ba19aa93724bb57650c71bf39 Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.471553 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"appr
over\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc 
kubenswrapper[4695]: I0320 10:55:08.479986 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nx4bc"] Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.491418 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.492520 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.495630 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.496581 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.496750 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.496838 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.496936 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.499386 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.500266 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.508995 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.522793 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.540954 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-run-netns\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.541005 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-node-log\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.541020 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-run-ovn-kubernetes\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.541040 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.541060 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7010d107-c3b1-4cc2-83c2-523df13ecd43-ovnkube-script-lib\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.541087 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-kubelet\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.541270 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-log-socket\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.541356 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-run-ovn\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.541411 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-run-openvswitch\") pod 
\"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.541465 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-cni-netd\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.541555 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7010d107-c3b1-4cc2-83c2-523df13ecd43-ovnkube-config\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.541595 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdz7d\" (UniqueName: \"kubernetes.io/projected/7010d107-c3b1-4cc2-83c2-523df13ecd43-kube-api-access-qdz7d\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.541640 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-var-lib-openvswitch\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.542676 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-run-systemd\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.542806 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-etc-openvswitch\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.542880 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-cni-bin\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.542936 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7010d107-c3b1-4cc2-83c2-523df13ecd43-ovn-node-metrics-cert\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.542970 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7010d107-c3b1-4cc2-83c2-523df13ecd43-env-overrides\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.543060 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-slash\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.543138 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-systemd-units\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.544151 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.557661 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.564126 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.564963 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.564987 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.565016 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.565027 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:08Z","lastTransitionTime":"2026-03-20T10:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.573443 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.589201 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.612355 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.628124 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\
\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.644225 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-kubelet\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.644270 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-run-ovn\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.644293 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-log-socket\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.644317 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-run-openvswitch\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.644339 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-cni-netd\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.644360 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7010d107-c3b1-4cc2-83c2-523df13ecd43-ovnkube-config\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.644380 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdz7d\" (UniqueName: \"kubernetes.io/projected/7010d107-c3b1-4cc2-83c2-523df13ecd43-kube-api-access-qdz7d\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.644413 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-var-lib-openvswitch\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.644444 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-run-systemd\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.644467 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-etc-openvswitch\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 
10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.644493 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-cni-bin\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.644518 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7010d107-c3b1-4cc2-83c2-523df13ecd43-ovn-node-metrics-cert\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.644539 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7010d107-c3b1-4cc2-83c2-523df13ecd43-env-overrides\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.644558 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-slash\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.644581 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-systemd-units\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.644609 4695 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-run-netns\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.644631 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7010d107-c3b1-4cc2-83c2-523df13ecd43-ovnkube-script-lib\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.644662 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-node-log\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.644676 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-run-ovn-kubernetes\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.644700 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.644785 4695 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.644843 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-kubelet\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.644876 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-run-ovn\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.644925 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-log-socket\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.644955 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-run-openvswitch\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.644983 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" 
(UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-cni-netd\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.645938 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-run-systemd\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.645945 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-run-netns\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.646030 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-node-log\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.646327 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-systemd-units\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.646614 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-run-ovn-kubernetes\") pod \"ovnkube-node-nx4bc\" 
(UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.646707 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-cni-bin\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.646660 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-etc-openvswitch\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.646728 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7010d107-c3b1-4cc2-83c2-523df13ecd43-ovnkube-script-lib\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.646943 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-slash\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.647116 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7010d107-c3b1-4cc2-83c2-523df13ecd43-ovnkube-config\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc 
kubenswrapper[4695]: I0320 10:55:08.647138 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-var-lib-openvswitch\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.648247 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7010d107-c3b1-4cc2-83c2-523df13ecd43-env-overrides\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.653753 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\",\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.666556 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7010d107-c3b1-4cc2-83c2-523df13ecd43-ovn-node-metrics-cert\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.667013 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdz7d\" (UniqueName: \"kubernetes.io/projected/7010d107-c3b1-4cc2-83c2-523df13ecd43-kube-api-access-qdz7d\") pod \"ovnkube-node-nx4bc\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.671188 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.671599 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.671614 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" 
Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.671635 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.671649 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:08Z","lastTransitionTime":"2026-03-20T10:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.684481 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network
-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.699242 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.723704 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.757571 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.771448 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.774574 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.774634 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.774650 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:08 crc 
kubenswrapper[4695]: I0320 10:55:08.774673 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.774687 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:08Z","lastTransitionTime":"2026-03-20T10:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.787608 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 
10:55:08.801039 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.877833 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.877883 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.877892 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.877927 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.877940 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:08Z","lastTransitionTime":"2026-03-20T10:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.927634 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:08 crc kubenswrapper[4695]: W0320 10:55:08.948141 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7010d107_c3b1_4cc2_83c2_523df13ecd43.slice/crio-f0a97ee302090f2da0e2af64936cace6f5bfeff8d85c49d32e937fee4400cc7d WatchSource:0}: Error finding container f0a97ee302090f2da0e2af64936cace6f5bfeff8d85c49d32e937fee4400cc7d: Status 404 returned error can't find the container with id f0a97ee302090f2da0e2af64936cace6f5bfeff8d85c49d32e937fee4400cc7d Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.980857 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.980950 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.980967 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.980985 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:08 crc kubenswrapper[4695]: I0320 10:55:08.980999 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:08Z","lastTransitionTime":"2026-03-20T10:55:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.087001 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.087061 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.087079 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.087101 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.087112 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:09Z","lastTransitionTime":"2026-03-20T10:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.191065 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.191129 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.191143 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.191168 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.191184 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:09Z","lastTransitionTime":"2026-03-20T10:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.222227 4695 generic.go:334] "Generic (PLEG): container finished" podID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerID="253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f" exitCode=0 Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.222329 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" event={"ID":"7010d107-c3b1-4cc2-83c2-523df13ecd43","Type":"ContainerDied","Data":"253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f"} Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.222368 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" event={"ID":"7010d107-c3b1-4cc2-83c2-523df13ecd43","Type":"ContainerStarted","Data":"f0a97ee302090f2da0e2af64936cace6f5bfeff8d85c49d32e937fee4400cc7d"} Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.225483 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" event={"ID":"0500a369-efac-495f-83aa-8b400fd54206","Type":"ContainerStarted","Data":"013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13"} Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.225564 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" event={"ID":"0500a369-efac-495f-83aa-8b400fd54206","Type":"ContainerStarted","Data":"897c29581bbdec6887cddfdfc5f4602c0fa2880ba19aa93724bb57650c71bf39"} Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.235203 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-75zwx" event={"ID":"548a9f17-19d1-4267-a179-75a82fe79a42","Type":"ContainerStarted","Data":"d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797"} Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.243522 4695 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/c
rcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697
acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\",\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.245631 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" event={"ID":"7859c924-84d7-4855-901e-c77a02c56e3a","Type":"ContainerStarted","Data":"0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2"} Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.245686 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" event={"ID":"7859c924-84d7-4855-901e-c77a02c56e3a","Type":"ContainerStarted","Data":"f52c3cc7c395c498c816cd540172b9c782623535c14aff204ea0efa08008cef3"} Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.245700 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" event={"ID":"7859c924-84d7-4855-901e-c77a02c56e3a","Type":"ContainerStarted","Data":"1d83951c92bb9816ab7a7b4eeefa14e332f8e4c6721cd4f07435010ea29b32e2"} Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.248990 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hg7g5" 
event={"ID":"52301735-de4f-4672-9e4d-6bd74bccedad","Type":"ContainerStarted","Data":"4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e"} Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.249035 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hg7g5" event={"ID":"52301735-de4f-4672-9e4d-6bd74bccedad","Type":"ContainerStarted","Data":"f70e19a426442a41514fc9348d3a9d2ea9d7ed196b11f5038818e5d44f1b9dd1"} Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.280155 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/k
ubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.300390 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.300437 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.300449 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.300487 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.300501 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:09Z","lastTransitionTime":"2026-03-20T10:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.312686 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.350853 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.375995 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.394015 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.403568 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.403600 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.403610 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:09 crc 
kubenswrapper[4695]: I0320 10:55:09.403628 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.403643 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:09Z","lastTransitionTime":"2026-03-20T10:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.422186 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 
10:55:09.440257 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.459076 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.477833 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.494662 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.511965 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.516589 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.516627 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.516639 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.516660 4695 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.516674 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:09Z","lastTransitionTime":"2026-03-20T10:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.530966 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.548561 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.572828 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.590573 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\",\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.611126 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.622597 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.622673 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.622688 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.622715 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.622731 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:09Z","lastTransitionTime":"2026-03-20T10:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.628818 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:
55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.655155 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.669262 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.688822 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.707127 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.726379 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 
10:55:09.726446 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.726456 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.726473 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.726482 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:09Z","lastTransitionTime":"2026-03-20T10:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.731900 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.746015 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.754369 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.754491 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.754522 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.754556 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.754582 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:09 crc kubenswrapper[4695]: E0320 10:55:09.754749 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:55:09 crc kubenswrapper[4695]: E0320 10:55:09.754773 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:55:09 crc kubenswrapper[4695]: E0320 10:55:09.754787 4695 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:55:09 crc kubenswrapper[4695]: E0320 10:55:09.754832 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:55:25.754817024 +0000 UTC m=+103.535422587 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:55:09 crc kubenswrapper[4695]: E0320 10:55:09.754891 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:55:25.754883005 +0000 UTC m=+103.535488568 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:55:09 crc kubenswrapper[4695]: E0320 10:55:09.754966 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:55:09 crc kubenswrapper[4695]: E0320 10:55:09.754980 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:55:09 crc kubenswrapper[4695]: E0320 10:55:09.754989 4695 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:55:09 crc kubenswrapper[4695]: E0320 10:55:09.755018 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:55:25.755008409 +0000 UTC m=+103.535613972 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:55:09 crc kubenswrapper[4695]: E0320 10:55:09.755075 4695 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:55:09 crc kubenswrapper[4695]: E0320 10:55:09.755103 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:55:25.755095151 +0000 UTC m=+103.535700714 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:55:09 crc kubenswrapper[4695]: E0320 10:55:09.755140 4695 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:55:09 crc kubenswrapper[4695]: E0320 10:55:09.755166 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-20 10:55:25.755158923 +0000 UTC m=+103.535764486 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.758236 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.776412 4695 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.796991 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.815303 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.828838 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.828890 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.828899 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.828930 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.828942 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:09Z","lastTransitionTime":"2026-03-20T10:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file 
in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.886160 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.886191 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:09 crc kubenswrapper[4695]: E0320 10:55:09.886346 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.886790 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:09 crc kubenswrapper[4695]: E0320 10:55:09.886861 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:55:09 crc kubenswrapper[4695]: E0320 10:55:09.887023 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.934810 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.935564 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.935683 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.935751 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:09 crc kubenswrapper[4695]: I0320 10:55:09.935842 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:09Z","lastTransitionTime":"2026-03-20T10:55:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.038368 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.038406 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.038416 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.038433 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.038444 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:10Z","lastTransitionTime":"2026-03-20T10:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.141337 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.141372 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.141382 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.141400 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.141411 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:10Z","lastTransitionTime":"2026-03-20T10:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.246814 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.246850 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.246860 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.246876 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.246886 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:10Z","lastTransitionTime":"2026-03-20T10:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.259464 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" event={"ID":"7010d107-c3b1-4cc2-83c2-523df13ecd43","Type":"ContainerStarted","Data":"8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84"} Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.259525 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" event={"ID":"7010d107-c3b1-4cc2-83c2-523df13ecd43","Type":"ContainerStarted","Data":"f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596"} Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.259537 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" event={"ID":"7010d107-c3b1-4cc2-83c2-523df13ecd43","Type":"ContainerStarted","Data":"b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380"} Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.259546 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" event={"ID":"7010d107-c3b1-4cc2-83c2-523df13ecd43","Type":"ContainerStarted","Data":"629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866"} Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.259556 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" event={"ID":"7010d107-c3b1-4cc2-83c2-523df13ecd43","Type":"ContainerStarted","Data":"a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38"} Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.259566 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" event={"ID":"7010d107-c3b1-4cc2-83c2-523df13ecd43","Type":"ContainerStarted","Data":"e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb"} Mar 20 10:55:10 crc kubenswrapper[4695]: 
I0320 10:55:10.260696 4695 generic.go:334] "Generic (PLEG): container finished" podID="0500a369-efac-495f-83aa-8b400fd54206" containerID="013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13" exitCode=0 Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.260777 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" event={"ID":"0500a369-efac-495f-83aa-8b400fd54206","Type":"ContainerDied","Data":"013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13"} Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.292932 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03
-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":
\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:10Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.311103 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\"
,\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:10Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.331680 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:10Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.346772 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:10Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.349591 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:10 crc 
kubenswrapper[4695]: I0320 10:55:10.349799 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.349877 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.349979 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.350056 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:10Z","lastTransitionTime":"2026-03-20T10:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.371931 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:10Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.398902 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:10Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.416551 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:10Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.434648 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:10Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.451404 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:10Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.462033 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.462098 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.462119 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.462145 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.462160 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:10Z","lastTransitionTime":"2026-03-20T10:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.470254 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:10Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.485019 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c427
45f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:10Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.504171 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:10Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.527385 4695 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:10Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.565472 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.565512 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.565561 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.565580 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.565590 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:10Z","lastTransitionTime":"2026-03-20T10:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.567386 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:10Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.668757 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.668790 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.668799 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.668817 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.668828 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:10Z","lastTransitionTime":"2026-03-20T10:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.772370 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.772404 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.772414 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.772431 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.772444 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:10Z","lastTransitionTime":"2026-03-20T10:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.875027 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.875087 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.875102 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.875125 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.875141 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:10Z","lastTransitionTime":"2026-03-20T10:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.981357 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.981686 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.981697 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.981717 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:10 crc kubenswrapper[4695]: I0320 10:55:10.981730 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:10Z","lastTransitionTime":"2026-03-20T10:55:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.084321 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.084370 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.084381 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.084403 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.084417 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:11Z","lastTransitionTime":"2026-03-20T10:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.187226 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.187281 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.187292 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.187311 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.187322 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:11Z","lastTransitionTime":"2026-03-20T10:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.267241 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" event={"ID":"0500a369-efac-495f-83aa-8b400fd54206","Type":"ContainerStarted","Data":"84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6"} Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.284671 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:11Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.289582 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:11 
crc kubenswrapper[4695]: I0320 10:55:11.289637 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.289650 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.289672 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.289684 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:11Z","lastTransitionTime":"2026-03-20T10:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.299829 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:11Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.312305 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:11Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.329763 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:11Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.358635 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:11Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.383728 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:11Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.392608 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.392660 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.392670 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.392692 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.392705 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:11Z","lastTransitionTime":"2026-03-20T10:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.410766 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\",\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:11Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.427169 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:11Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.444030 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:11Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.479379 4695 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:11Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.495715 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:11Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.496065 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.496104 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.496115 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.496133 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.496145 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:11Z","lastTransitionTime":"2026-03-20T10:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.511735 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:11Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.529443 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:11Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.548246 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:11Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.599123 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.599716 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.599789 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:11 crc 
kubenswrapper[4695]: I0320 10:55:11.599855 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.599981 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:11Z","lastTransitionTime":"2026-03-20T10:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.702965 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.703275 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.703383 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.703620 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.703765 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:11Z","lastTransitionTime":"2026-03-20T10:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.809559 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.809683 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.809769 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.809844 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.809926 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:11Z","lastTransitionTime":"2026-03-20T10:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.886551 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.886579 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.886720 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:11 crc kubenswrapper[4695]: E0320 10:55:11.886865 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:55:11 crc kubenswrapper[4695]: E0320 10:55:11.887334 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:55:11 crc kubenswrapper[4695]: E0320 10:55:11.887474 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.912595 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.912951 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.913211 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.913434 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:11 crc kubenswrapper[4695]: I0320 10:55:11.913646 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:11Z","lastTransitionTime":"2026-03-20T10:55:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.017117 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.017150 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.017158 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.017174 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.017184 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:12Z","lastTransitionTime":"2026-03-20T10:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.121447 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.121748 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.122040 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.122269 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.122535 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:12Z","lastTransitionTime":"2026-03-20T10:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.226046 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.226095 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.226108 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.226130 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.226143 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:12Z","lastTransitionTime":"2026-03-20T10:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.279037 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9"}
Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.280899 4695 generic.go:334] "Generic (PLEG): container finished" podID="0500a369-efac-495f-83aa-8b400fd54206" containerID="84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6" exitCode=0
Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.281005 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" event={"ID":"0500a369-efac-495f-83aa-8b400fd54206","Type":"ContainerDied","Data":"84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6"}
Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.293753 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.308294 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.324797 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z"
Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.339535 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.339613 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.339628 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.339678 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.339691 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:12Z","lastTransitionTime":"2026-03-20T10:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.346970 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.361923 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"st
arted\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\
\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.377435 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.396079 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\"
,\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.411935 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.429084 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.446164 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:12 crc 
kubenswrapper[4695]: I0320 10:55:12.446207 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.446217 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.446233 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.446246 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:12Z","lastTransitionTime":"2026-03-20T10:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.452121 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.485735 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.504974 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.520318 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.534216 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.547325 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.549482 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.549540 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.549555 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.549573 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.549584 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:12Z","lastTransitionTime":"2026-03-20T10:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.561756 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.576712 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.592625 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.609473 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.625620 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.638621 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.652192 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.652248 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.652259 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.652280 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.652291 4695 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:12Z","lastTransitionTime":"2026-03-20T10:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.653608 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-scrip
t\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.667041 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.682335 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.703555 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.726315 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.742074 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\"
,\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.755834 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.755880 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.755892 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.755932 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.755943 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:12Z","lastTransitionTime":"2026-03-20T10:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.759167 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.858129 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.858175 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.858185 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.858197 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.858207 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:12Z","lastTransitionTime":"2026-03-20T10:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.903558 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.918864 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.932500 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.945894 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.958283 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.960507 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.960562 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.960577 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.960602 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.960619 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:12Z","lastTransitionTime":"2026-03-20T10:55:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.972745 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:12 crc kubenswrapper[4695]: I0320 10:55:12.984892 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.001873 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-20T10:55:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.013559 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\
\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.029816 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\"
,\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.049756 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.063581 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.063623 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.063637 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.063658 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.063672 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:13Z","lastTransitionTime":"2026-03-20T10:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.064793 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:
55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.091998 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.117881 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.166141 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.166188 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.166200 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.166217 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.166229 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:13Z","lastTransitionTime":"2026-03-20T10:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.269188 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.269256 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.269269 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.269290 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.269306 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:13Z","lastTransitionTime":"2026-03-20T10:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.288777 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" event={"ID":"7010d107-c3b1-4cc2-83c2-523df13ecd43","Type":"ContainerStarted","Data":"d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3"} Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.291224 4695 generic.go:334] "Generic (PLEG): container finished" podID="0500a369-efac-495f-83aa-8b400fd54206" containerID="e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d" exitCode=0 Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.291261 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" event={"ID":"0500a369-efac-495f-83aa-8b400fd54206","Type":"ContainerDied","Data":"e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d"} Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.312414 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.329190 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\",\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.344491 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.361103 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.373400 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:13 crc 
kubenswrapper[4695]: I0320 10:55:13.373445 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.373456 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.373479 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.373492 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:13Z","lastTransitionTime":"2026-03-20T10:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.384595 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acc
ess-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.399401 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.416835 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.440326 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.458584 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.470504 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.475425 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.475459 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.475470 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:13 crc 
kubenswrapper[4695]: I0320 10:55:13.475486 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.475499 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:13Z","lastTransitionTime":"2026-03-20T10:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.482173 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"im
ageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.495876 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]
}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.509108 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.532661 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.578297 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.578368 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.578382 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.578410 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.578426 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:13Z","lastTransitionTime":"2026-03-20T10:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.680991 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.681343 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.681436 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.681520 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.681594 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:13Z","lastTransitionTime":"2026-03-20T10:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.785066 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.785428 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.785531 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.785641 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.785735 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:13Z","lastTransitionTime":"2026-03-20T10:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.880562 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.880618 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.880634 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.880663 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.880683 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:13Z","lastTransitionTime":"2026-03-20T10:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.886209 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.886398 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.886375 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:13 crc kubenswrapper[4695]: E0320 10:55:13.886667 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:55:13 crc kubenswrapper[4695]: E0320 10:55:13.886876 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:55:13 crc kubenswrapper[4695]: E0320 10:55:13.886996 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:55:13 crc kubenswrapper[4695]: E0320 10:55:13.903777 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.908844 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.908893 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.908920 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.908947 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.908963 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:13Z","lastTransitionTime":"2026-03-20T10:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:13 crc kubenswrapper[4695]: E0320 10:55:13.923032 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.927370 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.927419 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.927433 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.927455 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.927469 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:13Z","lastTransitionTime":"2026-03-20T10:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:13 crc kubenswrapper[4695]: E0320 10:55:13.950036 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.953827 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.953864 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.953873 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.953890 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.953901 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:13Z","lastTransitionTime":"2026-03-20T10:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:13 crc kubenswrapper[4695]: E0320 10:55:13.975839 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.979972 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.980044 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.980055 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.980077 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:13 crc kubenswrapper[4695]: I0320 10:55:13.980090 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:13Z","lastTransitionTime":"2026-03-20T10:55:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:14 crc kubenswrapper[4695]: E0320 10:55:14.000767 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:14 crc kubenswrapper[4695]: E0320 10:55:14.000971 4695 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.003963 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.004001 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.004013 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.004038 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.004055 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:14Z","lastTransitionTime":"2026-03-20T10:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.107577 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.107618 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.107628 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.107643 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.107654 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:14Z","lastTransitionTime":"2026-03-20T10:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.210251 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.210297 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.210307 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.210326 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.210342 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:14Z","lastTransitionTime":"2026-03-20T10:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.297595 4695 generic.go:334] "Generic (PLEG): container finished" podID="0500a369-efac-495f-83aa-8b400fd54206" containerID="27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648" exitCode=0 Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.297669 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" event={"ID":"0500a369-efac-495f-83aa-8b400fd54206","Type":"ContainerDied","Data":"27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648"} Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.314003 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.314058 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.314071 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.314091 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.314104 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:14Z","lastTransitionTime":"2026-03-20T10:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.341387 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.394487 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\"
,\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.417529 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.417578 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.417591 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.417609 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.417624 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:14Z","lastTransitionTime":"2026-03-20T10:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.420419 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.441475 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10
:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.464773 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.477525 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.491895 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.508625 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.523181 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 
10:55:14.523226 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.523239 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.523288 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.523303 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:14Z","lastTransitionTime":"2026-03-20T10:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.525158 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.539344 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.555558 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.574558 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:55:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.590343 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.608272 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded5
70826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.626167 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.626221 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.626240 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.626264 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.626278 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:14Z","lastTransitionTime":"2026-03-20T10:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.729739 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.729777 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.729786 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.729802 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.729813 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:14Z","lastTransitionTime":"2026-03-20T10:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.753968 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-qrsdf"] Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.754451 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-qrsdf" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.757785 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.758042 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.758294 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.758419 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.771851 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.786235 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:55:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.798083 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.814082 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/53713843-d62b-411e-908c-18f9452f6bf2-serviceca\") pod \"node-ca-qrsdf\" (UID: \"53713843-d62b-411e-908c-18f9452f6bf2\") " pod="openshift-image-registry/node-ca-qrsdf" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.814142 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vbjh\" (UniqueName: \"kubernetes.io/projected/53713843-d62b-411e-908c-18f9452f6bf2-kube-api-access-8vbjh\") pod \"node-ca-qrsdf\" (UID: \"53713843-d62b-411e-908c-18f9452f6bf2\") " pod="openshift-image-registry/node-ca-qrsdf" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.814191 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53713843-d62b-411e-908c-18f9452f6bf2-host\") pod \"node-ca-qrsdf\" (UID: \"53713843-d62b-411e-908c-18f9452f6bf2\") " pod="openshift-image-registry/node-ca-qrsdf" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.815400 4695 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.825602 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qrsdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53713843-d62b-411e-908c-18f9452f6bf2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qrsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.833730 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.833798 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.833810 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 
10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.833834 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.833845 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:14Z","lastTransitionTime":"2026-03-20T10:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.849809 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/
var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.866365 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\"
,\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.884449 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.911928 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.915608 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/53713843-d62b-411e-908c-18f9452f6bf2-serviceca\") pod \"node-ca-qrsdf\" (UID: \"53713843-d62b-411e-908c-18f9452f6bf2\") " pod="openshift-image-registry/node-ca-qrsdf" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.915681 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vbjh\" (UniqueName: \"kubernetes.io/projected/53713843-d62b-411e-908c-18f9452f6bf2-kube-api-access-8vbjh\") pod \"node-ca-qrsdf\" (UID: \"53713843-d62b-411e-908c-18f9452f6bf2\") " pod="openshift-image-registry/node-ca-qrsdf" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.915723 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53713843-d62b-411e-908c-18f9452f6bf2-host\") pod \"node-ca-qrsdf\" (UID: \"53713843-d62b-411e-908c-18f9452f6bf2\") " pod="openshift-image-registry/node-ca-qrsdf" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.915793 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/53713843-d62b-411e-908c-18f9452f6bf2-host\") pod \"node-ca-qrsdf\" (UID: \"53713843-d62b-411e-908c-18f9452f6bf2\") " pod="openshift-image-registry/node-ca-qrsdf" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.916779 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/53713843-d62b-411e-908c-18f9452f6bf2-serviceca\") pod \"node-ca-qrsdf\" (UID: \"53713843-d62b-411e-908c-18f9452f6bf2\") " pod="openshift-image-registry/node-ca-qrsdf" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.937049 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.937115 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:14 
crc kubenswrapper[4695]: I0320 10:55:14.937125 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.937145 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.937157 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:14Z","lastTransitionTime":"2026-03-20T10:55:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.940602 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.954852 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vbjh\" (UniqueName: \"kubernetes.io/projected/53713843-d62b-411e-908c-18f9452f6bf2-kube-api-access-8vbjh\") pod \"node-ca-qrsdf\" (UID: \"53713843-d62b-411e-908c-18f9452f6bf2\") " pod="openshift-image-registry/node-ca-qrsdf" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.959488 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.977817 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:14 crc kubenswrapper[4695]: I0320 10:55:14.992751 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.007764 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.021571 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.045427 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.045902 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.045970 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:15 crc 
kubenswrapper[4695]: I0320 10:55:15.045995 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.046027 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:15Z","lastTransitionTime":"2026-03-20T10:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.079856 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qrsdf" Mar 20 10:55:15 crc kubenswrapper[4695]: W0320 10:55:15.109364 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53713843_d62b_411e_908c_18f9452f6bf2.slice/crio-0a871a8372110cf97993e4f9d1f0376c6f8f8d07f341fd1396bcf3a6ee569587 WatchSource:0}: Error finding container 0a871a8372110cf97993e4f9d1f0376c6f8f8d07f341fd1396bcf3a6ee569587: Status 404 returned error can't find the container with id 0a871a8372110cf97993e4f9d1f0376c6f8f8d07f341fd1396bcf3a6ee569587 Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.150783 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.150831 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.150847 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.150867 4695 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.150883 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:15Z","lastTransitionTime":"2026-03-20T10:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.257766 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.257817 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.257833 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.257855 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.257869 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:15Z","lastTransitionTime":"2026-03-20T10:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.301782 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qrsdf" event={"ID":"53713843-d62b-411e-908c-18f9452f6bf2","Type":"ContainerStarted","Data":"0a871a8372110cf97993e4f9d1f0376c6f8f8d07f341fd1396bcf3a6ee569587"} Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.307014 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" event={"ID":"7010d107-c3b1-4cc2-83c2-523df13ecd43","Type":"ContainerStarted","Data":"ff0bb3a6b6d587894c481fbba688a80d6d88bcb60c695aa39c148f435072aad7"} Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.307722 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.307744 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.307755 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.312864 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" event={"ID":"0500a369-efac-495f-83aa-8b400fd54206","Type":"ContainerStarted","Data":"28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893"} Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.331734 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qrsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53713843-d62b-411e-908c-18f9452f6bf2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qrsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.347930 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.351924 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.361168 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.361226 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.361240 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.361258 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.361271 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:15Z","lastTransitionTime":"2026-03-20T10:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.368713 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.394333 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\"
,\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.415059 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.431758 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.453130 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0bb3a6b6d587894c481fbba688a80d6d88bcb60c695aa39c148f435072aad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.464323 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.464387 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.464400 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.464423 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.464435 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:15Z","lastTransitionTime":"2026-03-20T10:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.468358 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.483838 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.499503 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.512797 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.526131 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.537054 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.547822 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.560345 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.567337 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.567379 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.567393 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.567409 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.567423 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:15Z","lastTransitionTime":"2026-03-20T10:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.574219 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.587904 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.601848 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.620819 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk
42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.633715 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.648164 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.664091 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.670470 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.670516 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.670529 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.670550 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.670563 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:15Z","lastTransitionTime":"2026-03-20T10:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.678139 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:
55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.697258 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0bb3a6b6d587894c481fbba688a80d6d88bcb60c695aa39c148f435072aad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.708383 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qrsdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53713843-d62b-411e-908c-18f9452f6bf2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qrsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.737012 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.755890 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\",\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.772239 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.781465 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.781516 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.781528 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.781546 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.781558 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:15Z","lastTransitionTime":"2026-03-20T10:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.790263 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.807080 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.821718 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.886933 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:15 crc kubenswrapper[4695]: E0320 10:55:15.887488 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.887075 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:15 crc kubenswrapper[4695]: E0320 10:55:15.887591 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.887030 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:15 crc kubenswrapper[4695]: E0320 10:55:15.887691 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.889267 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.889355 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.889372 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.889397 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.889410 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:15Z","lastTransitionTime":"2026-03-20T10:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.992673 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.992718 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.992728 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.992745 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:15 crc kubenswrapper[4695]: I0320 10:55:15.992756 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:15Z","lastTransitionTime":"2026-03-20T10:55:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.096257 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.096307 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.096319 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.096338 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.096349 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:16Z","lastTransitionTime":"2026-03-20T10:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.199679 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.199717 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.199728 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.199749 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.199761 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:16Z","lastTransitionTime":"2026-03-20T10:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.303170 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.303228 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.303243 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.303267 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.303284 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:16Z","lastTransitionTime":"2026-03-20T10:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.317753 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qrsdf" event={"ID":"53713843-d62b-411e-908c-18f9452f6bf2","Type":"ContainerStarted","Data":"ecd827c2bd1743e23f1f976847aab27852abff1a9f4bd54950e6d359d2ee2533"} Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.321403 4695 generic.go:334] "Generic (PLEG): container finished" podID="0500a369-efac-495f-83aa-8b400fd54206" containerID="28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893" exitCode=0 Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.321463 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" event={"ID":"0500a369-efac-495f-83aa-8b400fd54206","Type":"ContainerDied","Data":"28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893"} Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.336856 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.351587 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.370297 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded5
70826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.385373 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.402807 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\"
,\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.405354 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.405471 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.405555 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.405646 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.405730 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:16Z","lastTransitionTime":"2026-03-20T10:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.421497 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.438672 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10
:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.460295 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0bb3a6b6d587894c481fbba688a80d6d88bcb60c695aa39c148f435072aad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.471856 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qrsdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53713843-d62b-411e-908c-18f9452f6bf2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecd827c2bd1743e23f1f976847aab27852abff1a9f4bd54950e6d359d2ee2533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qrsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.493351 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.508867 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.508934 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.508947 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.508963 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.508975 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:16Z","lastTransitionTime":"2026-03-20T10:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.509987 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.525762 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.543679 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.556107 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.571886 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.585555 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.600834 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.613364 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.613394 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.613404 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.613429 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.613442 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:16Z","lastTransitionTime":"2026-03-20T10:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.619978 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.632596 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.649363 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.663478 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.677576 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.695963 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded5
70826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.708188 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.716453 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.716508 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.716520 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.716540 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.716554 4695 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:16Z","lastTransitionTime":"2026-03-20T10:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.724996 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"
containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\",\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.743083 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.758712 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.783959 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0bb3a6b6d587894c481fbba688a80d6d88bcb60c695aa39c148f435072aad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.797743 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qrsdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53713843-d62b-411e-908c-18f9452f6bf2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecd827c2bd1743e23f1f976847aab27852abff1a9f4bd54950e6d359d2ee2533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qrsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.818263 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.819893 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.819946 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.819961 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.819986 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.820002 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:16Z","lastTransitionTime":"2026-03-20T10:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.923108 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.923666 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.923683 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.923704 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:16 crc kubenswrapper[4695]: I0320 10:55:16.923717 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:16Z","lastTransitionTime":"2026-03-20T10:55:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.026602 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.026650 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.026660 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.026679 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.026694 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:17Z","lastTransitionTime":"2026-03-20T10:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.130023 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.130069 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.130081 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.130097 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.130107 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:17Z","lastTransitionTime":"2026-03-20T10:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.232457 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.232506 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.232516 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.232535 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.232547 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:17Z","lastTransitionTime":"2026-03-20T10:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.334265 4695 generic.go:334] "Generic (PLEG): container finished" podID="0500a369-efac-495f-83aa-8b400fd54206" containerID="81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7" exitCode=0 Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.334419 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" event={"ID":"0500a369-efac-495f-83aa-8b400fd54206","Type":"ContainerDied","Data":"81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7"} Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.334485 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.334549 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.334561 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.334577 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.334603 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:17Z","lastTransitionTime":"2026-03-20T10:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.356818 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.379995 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\"
,\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.394038 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.408036 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.435025 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0bb3a6b6d587894c481fbba688a80d6d88bcb60c695aa39c148f435072aad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.437861 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.437947 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.437963 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.437984 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.437998 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:17Z","lastTransitionTime":"2026-03-20T10:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.452981 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qrsdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53713843-d62b-411e-908c-18f9452f6bf2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecd827c2bd1743e23f1f976847aab27852abff1a9f4bd54950e6d359d2ee2533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qrsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.469243 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.487180 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.509030 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.524429 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.537411 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.541084 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.541153 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.541168 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:17 crc 
kubenswrapper[4695]: I0320 10:55:17.541214 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.541228 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:17Z","lastTransitionTime":"2026-03-20T10:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.548848 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"im
ageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:55:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.563363 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]
}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.576672 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.593876 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:17Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.643701 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.643735 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.643746 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.643764 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.643776 4695 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:17Z","lastTransitionTime":"2026-03-20T10:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.746767 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.747253 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.747268 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.747293 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.747307 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:17Z","lastTransitionTime":"2026-03-20T10:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.849993 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.850047 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.850060 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.850103 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.850117 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:17Z","lastTransitionTime":"2026-03-20T10:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.886418 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.886459 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.886523 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:17 crc kubenswrapper[4695]: E0320 10:55:17.886606 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:55:17 crc kubenswrapper[4695]: E0320 10:55:17.886705 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:55:17 crc kubenswrapper[4695]: E0320 10:55:17.886787 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.953061 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.953105 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.953117 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.953138 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:17 crc kubenswrapper[4695]: I0320 10:55:17.953151 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:17Z","lastTransitionTime":"2026-03-20T10:55:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.060522 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.060573 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.060583 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.060603 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.060615 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:18Z","lastTransitionTime":"2026-03-20T10:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.163452 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.163504 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.163515 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.163537 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.163551 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:18Z","lastTransitionTime":"2026-03-20T10:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.266751 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.266792 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.266806 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.266824 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.266836 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:18Z","lastTransitionTime":"2026-03-20T10:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.342580 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" event={"ID":"0500a369-efac-495f-83aa-8b400fd54206","Type":"ContainerStarted","Data":"308a0ad5e55245d181191a0ed8b1dfd84d8aa18e32d617bed8e3611679fc5b7d"} Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.359491 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:18Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.369967 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.370029 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.370041 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.370060 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.370073 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:18Z","lastTransitionTime":"2026-03-20T10:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.380296 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:18Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.395023 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:18Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.410733 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:18Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.427378 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:18Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.441636 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:18Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.457220 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:55:18Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.473209 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.473278 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.473297 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.473320 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.473340 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:18Z","lastTransitionTime":"2026-03-20T10:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.474789 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:18Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.494038 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308a0ad5e55245d181191a0ed8b1dfd84d8aa18e32d617bed8e3611679fc5b7d\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:18Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.509454 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qrsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53713843-d62b-411e-908c-18f9452f6bf2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecd827c2bd1743e23f1f976847aab27852abff1a9f4bd54950e6d359d2ee2533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qrsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:18Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.541374 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10
:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:18Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.557832 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\"
,\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:18Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.572410 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:18Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.577268 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.577332 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.577346 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.577371 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.577385 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:18Z","lastTransitionTime":"2026-03-20T10:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.590442 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:
55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:18Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.613203 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0bb3a6b6d587894c481fbba688a80d6d88bcb60c695aa39c148f435072aad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:18Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.681224 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.681287 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.681300 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.681331 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.681344 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:18Z","lastTransitionTime":"2026-03-20T10:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.783760 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.783817 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.783831 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.783851 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.783864 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:18Z","lastTransitionTime":"2026-03-20T10:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.886890 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.886948 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.886958 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.886974 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.886984 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:18Z","lastTransitionTime":"2026-03-20T10:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.989652 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.989708 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.989722 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.989745 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:18 crc kubenswrapper[4695]: I0320 10:55:18.989758 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:18Z","lastTransitionTime":"2026-03-20T10:55:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.092692 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.092735 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.092745 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.092762 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.092773 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:19Z","lastTransitionTime":"2026-03-20T10:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.195782 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.195847 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.195862 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.195886 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.195901 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:19Z","lastTransitionTime":"2026-03-20T10:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.298925 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.298998 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.299014 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.299037 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.299048 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:19Z","lastTransitionTime":"2026-03-20T10:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.349223 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx4bc_7010d107-c3b1-4cc2-83c2-523df13ecd43/ovnkube-controller/0.log" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.353216 4695 generic.go:334] "Generic (PLEG): container finished" podID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerID="ff0bb3a6b6d587894c481fbba688a80d6d88bcb60c695aa39c148f435072aad7" exitCode=1 Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.353264 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" event={"ID":"7010d107-c3b1-4cc2-83c2-523df13ecd43","Type":"ContainerDied","Data":"ff0bb3a6b6d587894c481fbba688a80d6d88bcb60c695aa39c148f435072aad7"} Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.354122 4695 scope.go:117] "RemoveContainer" containerID="ff0bb3a6b6d587894c481fbba688a80d6d88bcb60c695aa39c148f435072aad7" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.375367 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:19Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.396988 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:19Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.404775 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.404845 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.404859 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.404896 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.404955 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:19Z","lastTransitionTime":"2026-03-20T10:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.415308 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:19Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.431119 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:19Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.446109 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:19Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.461427 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:19Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.477699 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:55:19Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.490443 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:19Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.506120 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308a0ad5e55245d181191a0ed8b1dfd84d8aa18e
32d617bed8e3611679fc5b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:19Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.506749 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.506823 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.506833 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.506852 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.506861 4695 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:19Z","lastTransitionTime":"2026-03-20T10:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.526367 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-
dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac
5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6
7314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:19Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.542098 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\"
,\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:19Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.556101 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:19Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.572733 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:19Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.594988 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0bb3a6b6d587894c481fbba688a80d6d88bcb60c695aa39c148f435072aad7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff0bb3a6b6d587894c481fbba688a80d6d88bcb60c695aa39c148f435072aad7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"message\\\":\\\".io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:55:18.974155 6346 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:55:18.975307 6346 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 10:55:18.975326 6346 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 10:55:18.975348 6346 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:55:18.975358 6346 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 10:55:18.975374 6346 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 10:55:18.975396 6346 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 10:55:18.975413 6346 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 10:55:18.975418 6346 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 10:55:18.975423 6346 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:55:18.975439 6346 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:55:18.975453 6346 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:55:18.975456 6346 factory.go:656] Stopping watch factory\\\\nI0320 10:55:18.975469 6346 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:19Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.609980 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qrsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53713843-d62b-411e-908c-18f9452f6bf2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecd827c2bd1743e23f1f976847aab27852abff1a9f4bd54950e6d359d2ee2533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qrsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:19Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.610568 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.610618 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.610634 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.610656 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.610669 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:19Z","lastTransitionTime":"2026-03-20T10:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.712810 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.712862 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.712875 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.712892 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.712924 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:19Z","lastTransitionTime":"2026-03-20T10:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.814871 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.814931 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.814941 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.814954 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.814965 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:19Z","lastTransitionTime":"2026-03-20T10:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.886489 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.886524 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.886545 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:19 crc kubenswrapper[4695]: E0320 10:55:19.886671 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:55:19 crc kubenswrapper[4695]: E0320 10:55:19.886762 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:55:19 crc kubenswrapper[4695]: E0320 10:55:19.886955 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.917774 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.917844 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.917864 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.917887 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:19 crc kubenswrapper[4695]: I0320 10:55:19.917926 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:19Z","lastTransitionTime":"2026-03-20T10:55:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.021143 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.021223 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.021240 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.021263 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.021278 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:20Z","lastTransitionTime":"2026-03-20T10:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.124409 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.124475 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.124488 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.124505 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.124518 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:20Z","lastTransitionTime":"2026-03-20T10:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.227070 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.227122 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.227134 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.227153 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.227166 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:20Z","lastTransitionTime":"2026-03-20T10:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.330444 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.330502 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.330512 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.330535 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.330551 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:20Z","lastTransitionTime":"2026-03-20T10:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.380045 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx4bc_7010d107-c3b1-4cc2-83c2-523df13ecd43/ovnkube-controller/0.log" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.383457 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" event={"ID":"7010d107-c3b1-4cc2-83c2-523df13ecd43","Type":"ContainerStarted","Data":"6ca146416501b5149e392a19955444d0679d445ec98b41acb7c6f7c720c924c0"} Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.384792 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.403096 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.419195 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.433004 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.433063 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.433075 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.433092 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.433104 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:20Z","lastTransitionTime":"2026-03-20T10:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.455244 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.480378 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.519481 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.535863 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.535903 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.535943 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.535963 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.535975 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:20Z","lastTransitionTime":"2026-03-20T10:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.538642 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.552111 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.569342 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308a0ad5e55245d181191a0ed8b1dfd84d8aa18e32d617bed8e3611679fc5b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c6b
10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.581223 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.594994 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\"
,\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.613557 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.628948 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.639092 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:20 crc 
kubenswrapper[4695]: I0320 10:55:20.639162 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.639173 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.639187 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.639198 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:20Z","lastTransitionTime":"2026-03-20T10:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.651614 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ca146416501b5149e392a19955444d0679d445ec98b41acb7c6f7c720c924c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff0bb3a6b6d587894c481fbba688a80d6d88bcb60c695aa39c148f435072aad7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"message\\\":\\\".io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 
10:55:18.974155 6346 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:55:18.975307 6346 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 10:55:18.975326 6346 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 10:55:18.975348 6346 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:55:18.975358 6346 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 10:55:18.975374 6346 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 10:55:18.975396 6346 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 10:55:18.975413 6346 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 10:55:18.975418 6346 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 10:55:18.975423 6346 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:55:18.975439 6346 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:55:18.975453 6346 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:55:18.975456 6346 factory.go:656] Stopping watch factory\\\\nI0320 10:55:18.975469 6346 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.665875 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qrsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53713843-d62b-411e-908c-18f9452f6bf2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecd827c2bd1743e23f1f976847aab27852abff1a9f4bd54950e6d359d2ee2533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qrsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.685806 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10
:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.742849 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.742891 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.742919 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.742944 4695 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.742960 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:20Z","lastTransitionTime":"2026-03-20T10:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.803148 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r"] Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.803829 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.806582 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.807965 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.820698 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.832152 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.846083 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.846402 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.846466 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.846531 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.846607 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:20Z","lastTransitionTime":"2026-03-20T10:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.848536 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308a0ad5e55245d181191a0ed8b1dfd84d8aa18e32d617bed8e3611679fc5b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.867419 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e
18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:20 crc 
kubenswrapper[4695]: I0320 10:55:20.885329 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8
f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\",\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.890083 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftmjs\" (UniqueName: \"kubernetes.io/projected/7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8-kube-api-access-ftmjs\") pod \"ovnkube-control-plane-749d76644c-85t4r\" (UID: \"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.890154 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-85t4r\" (UID: \"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.890192 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-85t4r\" (UID: 
\"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.890239 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8-env-overrides\") pod \"ovnkube-control-plane-749d76644c-85t4r\" (UID: \"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.902225 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\
"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.918092 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.941143 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ca146416501b5149e392a19955444d0679d445ec98b41acb7c6f7c720c924c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff0bb3a6b6d587894c481fbba688a80d6d88bcb60c695aa39c148f435072aad7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"message\\\":\\\".io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:55:18.974155 6346 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:55:18.975307 6346 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI0320 10:55:18.975326 6346 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 10:55:18.975348 6346 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:55:18.975358 6346 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 10:55:18.975374 6346 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 10:55:18.975396 6346 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 10:55:18.975413 6346 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 10:55:18.975418 6346 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 10:55:18.975423 6346 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:55:18.975439 6346 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:55:18.975453 6346 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:55:18.975456 6346 factory.go:656] Stopping watch factory\\\\nI0320 10:55:18.975469 6346 handler.go:208] Removed *v1.Node 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.949379 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.949438 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.949449 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.949466 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.949478 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:20Z","lastTransitionTime":"2026-03-20T10:55:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.956028 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qrsdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53713843-d62b-411e-908c-18f9452f6bf2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecd827c2bd1743e23f1f976847aab27852abff1a9f4bd54950e6d359d2ee2533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qrsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.972151 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-85t4r\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.990717 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8-env-overrides\") pod \"ovnkube-control-plane-749d76644c-85t4r\" (UID: \"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.990789 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftmjs\" (UniqueName: \"kubernetes.io/projected/7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8-kube-api-access-ftmjs\") pod \"ovnkube-control-plane-749d76644c-85t4r\" (UID: \"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.990837 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-85t4r\" (UID: \"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.990863 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-85t4r\" (UID: \"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.992073 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-85t4r\" (UID: \"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.992286 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8-env-overrides\") pod \"ovnkube-control-plane-749d76644c-85t4r\" (UID: \"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" Mar 20 10:55:20 crc kubenswrapper[4695]: I0320 10:55:20.995034 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.005612 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-85t4r\" (UID: \"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.018208 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.020523 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftmjs\" (UniqueName: \"kubernetes.io/projected/7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8-kube-api-access-ftmjs\") pod \"ovnkube-control-plane-749d76644c-85t4r\" (UID: 
\"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.034088 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri
-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.046497 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.053112 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.053166 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.053178 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.053195 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.053205 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:21Z","lastTransitionTime":"2026-03-20T10:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.060107 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.072407 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.117076 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" Mar 20 10:55:21 crc kubenswrapper[4695]: W0320 10:55:21.139187 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f8a6fc9_0bc7_4086_9436_8c5a3cfcb5f8.slice/crio-c1277e2343416adddff4e0f6b2a558e32e5552c4ef0438f8c594cd3f9a3df45f WatchSource:0}: Error finding container c1277e2343416adddff4e0f6b2a558e32e5552c4ef0438f8c594cd3f9a3df45f: Status 404 returned error can't find the container with id c1277e2343416adddff4e0f6b2a558e32e5552c4ef0438f8c594cd3f9a3df45f Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.155529 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.155570 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.155579 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.155596 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.155611 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:21Z","lastTransitionTime":"2026-03-20T10:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.260799 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.260883 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.260926 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.260954 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.260987 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:21Z","lastTransitionTime":"2026-03-20T10:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.364288 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.364348 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.364362 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.364384 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.364398 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:21Z","lastTransitionTime":"2026-03-20T10:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.389465 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx4bc_7010d107-c3b1-4cc2-83c2-523df13ecd43/ovnkube-controller/1.log" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.390059 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx4bc_7010d107-c3b1-4cc2-83c2-523df13ecd43/ovnkube-controller/0.log" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.393443 4695 generic.go:334] "Generic (PLEG): container finished" podID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerID="6ca146416501b5149e392a19955444d0679d445ec98b41acb7c6f7c720c924c0" exitCode=1 Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.393537 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" event={"ID":"7010d107-c3b1-4cc2-83c2-523df13ecd43","Type":"ContainerDied","Data":"6ca146416501b5149e392a19955444d0679d445ec98b41acb7c6f7c720c924c0"} Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.393590 4695 scope.go:117] "RemoveContainer" containerID="ff0bb3a6b6d587894c481fbba688a80d6d88bcb60c695aa39c148f435072aad7" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.394289 4695 scope.go:117] "RemoveContainer" containerID="6ca146416501b5149e392a19955444d0679d445ec98b41acb7c6f7c720c924c0" Mar 20 10:55:21 crc kubenswrapper[4695]: E0320 10:55:21.394468 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-nx4bc_openshift-ovn-kubernetes(7010d107-c3b1-4cc2-83c2-523df13ecd43)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.395937 4695 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" event={"ID":"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8","Type":"ContainerStarted","Data":"c1277e2343416adddff4e0f6b2a558e32e5552c4ef0438f8c594cd3f9a3df45f"} Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.409043 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.424401 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.439462 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.455136 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.468427 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.468725 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.468747 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.468757 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:21 crc 
kubenswrapper[4695]: I0320 10:55:21.468774 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.468786 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:21Z","lastTransitionTime":"2026-03-20T10:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.480360 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"im
ageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.492802 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]
}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.505955 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.522362 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308a0ad5e55245d181191a0ed8b1dfd84d8aa18e32d617bed8e3611679fc5b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c6b
10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.543574 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ca146416501b5149e392a19955444d0679d445ec98b41acb7c6f7c720c924c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff0bb3a6b6d587894c481fbba688a80d6d88bcb60c695aa39c148f435072aad7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"message\\\":\\\".io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:55:18.974155 6346 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:55:18.975307 6346 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI0320 10:55:18.975326 6346 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 10:55:18.975348 6346 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:55:18.975358 6346 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 10:55:18.975374 6346 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 10:55:18.975396 6346 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 10:55:18.975413 6346 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 10:55:18.975418 6346 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 10:55:18.975423 6346 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:55:18.975439 6346 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:55:18.975453 6346 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:55:18.975456 6346 factory.go:656] Stopping watch factory\\\\nI0320 10:55:18.975469 6346 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ca146416501b5149e392a19955444d0679d445ec98b41acb7c6f7c720c924c0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"message\\\":\\\"work=default: []services.lbConfig(nil)\\\\nI0320 10:55:20.524670 6581 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:55:20.524676 6581 services_controller.go:445] Built service openshift-apiserver/check-endpoints LB template configs for network=default: []services.lbConfig(nil)\\\\nI0320 10:55:20.524700 6581 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:55:20.524707 6581 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:55:20.524699 6581 services_controller.go:451] Built service openshift-apiserver/check-endpoints 
cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 10:55:20.524718 6581 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:55:20.524787 6581 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"n
ame\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"en
v-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 
10:55:21.554941 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-h5s76"] Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.555949 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:55:21 crc kubenswrapper[4695]: E0320 10:55:21.556047 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.558868 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qrsdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53713843-d62b-411e-908c-18f9452f6bf2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecd827c2bd1743e23f1f976847aab278
52abff1a9f4bd54950e6d359d2ee2533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qrsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.572307 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-85t4r\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.572634 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.572697 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.572706 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.572725 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.572737 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:21Z","lastTransitionTime":"2026-03-20T10:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.596136 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8ndm\" (UniqueName: \"kubernetes.io/projected/e0468323-460e-4bf3-be74-9c2330bde834-kube-api-access-k8ndm\") pod \"network-metrics-daemon-h5s76\" (UID: \"e0468323-460e-4bf3-be74-9c2330bde834\") " pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.596202 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0468323-460e-4bf3-be74-9c2330bde834-metrics-certs\") pod \"network-metrics-daemon-h5s76\" (UID: \"e0468323-460e-4bf3-be74-9c2330bde834\") " pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.603665 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.621266 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\",\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.637508 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.654747 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.666796 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c427
45f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.682181 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.682231 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.682243 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.682263 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.682273 4695 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:21Z","lastTransitionTime":"2026-03-20T10:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.684544 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"
iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.697189 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8ndm\" (UniqueName: \"kubernetes.io/projected/e0468323-460e-4bf3-be74-9c2330bde834-kube-api-access-k8ndm\") pod \"network-metrics-daemon-h5s76\" (UID: \"e0468323-460e-4bf3-be74-9c2330bde834\") " pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.697651 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0468323-460e-4bf3-be74-9c2330bde834-metrics-certs\") pod \"network-metrics-daemon-h5s76\" (UID: \"e0468323-460e-4bf3-be74-9c2330bde834\") " pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:55:21 crc kubenswrapper[4695]: E0320 10:55:21.698000 4695 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:55:21 crc kubenswrapper[4695]: E0320 10:55:21.698167 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0468323-460e-4bf3-be74-9c2330bde834-metrics-certs podName:e0468323-460e-4bf3-be74-9c2330bde834 nodeName:}" 
failed. No retries permitted until 2026-03-20 10:55:22.198115107 +0000 UTC m=+99.978720840 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0468323-460e-4bf3-be74-9c2330bde834-metrics-certs") pod "network-metrics-daemon-h5s76" (UID: "e0468323-460e-4bf3-be74-9c2330bde834") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.699550 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns
-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.715835 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8ndm\" (UniqueName: \"kubernetes.io/projected/e0468323-460e-4bf3-be74-9c2330bde834-kube-api-access-k8ndm\") pod \"network-metrics-daemon-h5s76\" (UID: \"e0468323-460e-4bf3-be74-9c2330bde834\") " pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.718214 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308a0ad5e55245d181191a0ed8b1dfd84d8aa18e32d617bed8e3611679fc5b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c6b
10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.737724 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-85t4r\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.762837 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd7
4e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.786147 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\"
,\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.788191 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.788233 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.788262 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.788279 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.788290 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:21Z","lastTransitionTime":"2026-03-20T10:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.802901 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.821372 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10
:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.843748 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ca146416501b5149e392a19955444d0679d445ec98b41acb7c6f7c720c924c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff0bb3a6b6d587894c481fbba688a80d6d88bcb60c695aa39c148f435072aad7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"message\\\":\\\".io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:55:18.974155 6346 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:55:18.975307 6346 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI0320 10:55:18.975326 6346 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 10:55:18.975348 6346 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:55:18.975358 6346 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 10:55:18.975374 6346 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 10:55:18.975396 6346 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 10:55:18.975413 6346 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 10:55:18.975418 6346 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 10:55:18.975423 6346 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:55:18.975439 6346 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:55:18.975453 6346 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:55:18.975456 6346 factory.go:656] Stopping watch factory\\\\nI0320 10:55:18.975469 6346 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ca146416501b5149e392a19955444d0679d445ec98b41acb7c6f7c720c924c0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"message\\\":\\\"work=default: []services.lbConfig(nil)\\\\nI0320 10:55:20.524670 6581 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:55:20.524676 6581 services_controller.go:445] Built service openshift-apiserver/check-endpoints LB template configs for network=default: []services.lbConfig(nil)\\\\nI0320 10:55:20.524700 6581 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:55:20.524707 6581 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:55:20.524699 6581 services_controller.go:451] Built service openshift-apiserver/check-endpoints 
cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 10:55:20.524718 6581 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:55:20.524787 6581 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"n
ame\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"en
v-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 
10:55:21.856973 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qrsdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53713843-d62b-411e-908c-18f9452f6bf2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecd827c2bd1743e23f1f976847aab27852abff1a9f4bd54950e6d359d2ee2533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbjh\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qrsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.869610 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.885802 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.886340 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.886373 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.886580 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:21 crc kubenswrapper[4695]: E0320 10:55:21.886763 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:55:21 crc kubenswrapper[4695]: E0320 10:55:21.886897 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:55:21 crc kubenswrapper[4695]: E0320 10:55:21.887034 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.890789 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.890847 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.890862 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.890883 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.890898 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:21Z","lastTransitionTime":"2026-03-20T10:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.899778 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.912728 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h5s76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0468323-460e-4bf3-be74-9c2330bde834\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h5s76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc 
kubenswrapper[4695]: I0320 10:55:21.930098 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.942617 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:21Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.993868 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.993945 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.993961 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:21 crc 
kubenswrapper[4695]: I0320 10:55:21.993985 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:21 crc kubenswrapper[4695]: I0320 10:55:21.994002 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:21Z","lastTransitionTime":"2026-03-20T10:55:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.096415 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.096462 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.096475 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.096493 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.096507 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:22Z","lastTransitionTime":"2026-03-20T10:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.199698 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.199758 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.199770 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.199793 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.199807 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:22Z","lastTransitionTime":"2026-03-20T10:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.202189 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0468323-460e-4bf3-be74-9c2330bde834-metrics-certs\") pod \"network-metrics-daemon-h5s76\" (UID: \"e0468323-460e-4bf3-be74-9c2330bde834\") " pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:55:22 crc kubenswrapper[4695]: E0320 10:55:22.202361 4695 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:55:22 crc kubenswrapper[4695]: E0320 10:55:22.202446 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0468323-460e-4bf3-be74-9c2330bde834-metrics-certs podName:e0468323-460e-4bf3-be74-9c2330bde834 nodeName:}" failed. No retries permitted until 2026-03-20 10:55:23.202423577 +0000 UTC m=+100.983029150 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0468323-460e-4bf3-be74-9c2330bde834-metrics-certs") pod "network-metrics-daemon-h5s76" (UID: "e0468323-460e-4bf3-be74-9c2330bde834") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.302821 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.302877 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.302889 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.302947 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.302958 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:22Z","lastTransitionTime":"2026-03-20T10:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.401571 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" event={"ID":"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8","Type":"ContainerStarted","Data":"25f834ec041a5e5fd186567eed4651136f52f2a785f2eee742e12da0470a1dae"} Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.401665 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" event={"ID":"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8","Type":"ContainerStarted","Data":"d8e6a699ee6242e5555c1c7029f5f66e1ff2d87d2115fdaf7edf95489007330f"} Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.404082 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx4bc_7010d107-c3b1-4cc2-83c2-523df13ecd43/ovnkube-controller/1.log" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.405883 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.405998 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.406024 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.406053 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.406076 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:22Z","lastTransitionTime":"2026-03-20T10:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.410543 4695 scope.go:117] "RemoveContainer" containerID="6ca146416501b5149e392a19955444d0679d445ec98b41acb7c6f7c720c924c0" Mar 20 10:55:22 crc kubenswrapper[4695]: E0320 10:55:22.410724 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-nx4bc_openshift-ovn-kubernetes(7010d107-c3b1-4cc2-83c2-523df13ecd43)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.426637 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.444735 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.458738 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.475322 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308a0ad5e55245d181191a0ed8b1dfd84d8aa18e32d617bed8e3611679fc5b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6
328243fe3bf8a767582a582cb50a3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.491312 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.507217 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.508862 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.508904 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.508934 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.508962 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.508978 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:22Z","lastTransitionTime":"2026-03-20T10:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.525092 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.541530 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10
:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.560309 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ca146416501b5149e392a19955444d0679d445ec98b41acb7c6f7c720c924c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ff0bb3a6b6d587894c481fbba688a80d6d88bcb60c695aa39c148f435072aad7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"message\\\":\\\".io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:55:18.974155 6346 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:55:18.975307 6346 handler.go:190] Sending *v1.Pod event handler 3 for 
removal\\\\nI0320 10:55:18.975326 6346 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 10:55:18.975348 6346 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:55:18.975358 6346 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 10:55:18.975374 6346 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 10:55:18.975396 6346 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 10:55:18.975413 6346 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 10:55:18.975418 6346 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 10:55:18.975423 6346 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:55:18.975439 6346 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:55:18.975453 6346 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:55:18.975456 6346 factory.go:656] Stopping watch factory\\\\nI0320 10:55:18.975469 6346 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ca146416501b5149e392a19955444d0679d445ec98b41acb7c6f7c720c924c0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"message\\\":\\\"work=default: []services.lbConfig(nil)\\\\nI0320 10:55:20.524670 6581 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:55:20.524676 6581 services_controller.go:445] Built service openshift-apiserver/check-endpoints LB template configs for network=default: []services.lbConfig(nil)\\\\nI0320 10:55:20.524700 6581 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:55:20.524707 6581 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:55:20.524699 6581 services_controller.go:451] Built service openshift-apiserver/check-endpoints 
cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 10:55:20.524718 6581 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:55:20.524787 6581 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"n
ame\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"en
v-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 
10:55:22.572828 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qrsdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53713843-d62b-411e-908c-18f9452f6bf2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecd827c2bd1743e23f1f976847aab27852abff1a9f4bd54950e6d359d2ee2533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbjh\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qrsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.586161 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e6a699ee6242e5555c1c7029f5f66e1ff2d87d2115fdaf7edf95489007330f\\\",\\\"image\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f834ec041a5e5fd186567eed4651136f52f2a785f2eee742e12da0470a1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-85t4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.611357 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.611402 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.611415 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.611432 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.611444 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:22Z","lastTransitionTime":"2026-03-20T10:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.612250 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.629338 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\"
,\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.644475 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.660781 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.676182 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.689709 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h5s76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0468323-460e-4bf3-be74-9c2330bde834\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h5s76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc 
kubenswrapper[4695]: I0320 10:55:22.705761 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.713488 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.713551 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.713561 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.713578 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.713589 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:22Z","lastTransitionTime":"2026-03-20T10:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.723801 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.741861 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308a0ad5e55245d181191a0ed8b1dfd84d8aa18e32d617bed8e3611679fc5b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549
bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:
55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.756203 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.769827 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.783037 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.801772 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0
e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount
\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.816621 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.816703 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.816717 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.816736 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.816748 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:22Z","lastTransitionTime":"2026-03-20T10:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.821448 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ca146416501b5149e392a19955444d0679d445ec98b41acb7c6f7c720c924c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ca146416501b5149e392a19955444d0679d445ec98b41acb7c6f7c720c924c0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"message\\\":\\\"work=default: []services.lbConfig(nil)\\\\nI0320 10:55:20.524670 6581 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:55:20.524676 6581 services_controller.go:445] Built service openshift-apiserver/check-endpoints LB template configs for network=default: []services.lbConfig(nil)\\\\nI0320 10:55:20.524700 6581 handler.go:208] Removed *v1.EgressIP 
event handler 8\\\\nI0320 10:55:20.524707 6581 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:55:20.524699 6581 services_controller.go:451] Built service openshift-apiserver/check-endpoints cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 10:55:20.524718 6581 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:55:20.524787 6581 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nx4bc_openshift-ovn-kubernetes(7010d107-c3b1-4cc2-83c2-523df13ecd43)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc25931
3e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.832206 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qrsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53713843-d62b-411e-908c-18f9452f6bf2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecd827c2bd1743e23f1f976847aab27852abff1a9f4bd54950e6d359d2ee2533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qrsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.843755 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e6a699ee6242e5555c1c7029f5f66e1ff2d87d2115fdaf7edf95489007330f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f834ec041a5e5fd186567eed4651136f52f2a785f2eee742e12da0470a1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-85t4r\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.867336 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7
c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.882813 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\"
,\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.886031 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:55:22 crc kubenswrapper[4695]: E0320 10:55:22.886211 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.898226 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.913649 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"}
,{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.918924 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.918973 4695 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.918985 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.919002 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.919016 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:22Z","lastTransitionTime":"2026-03-20T10:55:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.928582 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.939600 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h5s76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0468323-460e-4bf3-be74-9c2330bde834\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h5s76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc 
kubenswrapper[4695]: I0320 10:55:22.953531 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.974482 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:22 crc kubenswrapper[4695]: I0320 10:55:22.989356 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\",\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.002119 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.016589 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.021646 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:23 crc 
kubenswrapper[4695]: I0320 10:55:23.021702 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.021713 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.021734 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.021747 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:23Z","lastTransitionTime":"2026-03-20T10:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.043956 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ca146416501b5149e392a19955444d0679d445ec98b41acb7c6f7c720c924c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ca146416501b5149e392a19955444d0679d445ec98b41acb7c6f7c720c924c0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"message\\\":\\\"work=default: []services.lbConfig(nil)\\\\nI0320 10:55:20.524670 6581 ovnkube.go:599] 
Stopped ovnkube\\\\nI0320 10:55:20.524676 6581 services_controller.go:445] Built service openshift-apiserver/check-endpoints LB template configs for network=default: []services.lbConfig(nil)\\\\nI0320 10:55:20.524700 6581 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:55:20.524707 6581 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:55:20.524699 6581 services_controller.go:451] Built service openshift-apiserver/check-endpoints cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 10:55:20.524718 6581 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:55:20.524787 6581 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nx4bc_openshift-ovn-kubernetes(7010d107-c3b1-4cc2-83c2-523df13ecd43)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc25931
3e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.057066 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qrsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53713843-d62b-411e-908c-18f9452f6bf2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecd827c2bd1743e23f1f976847aab27852abff1a9f4bd54950e6d359d2ee2533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qrsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.075316 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e6a699ee6242e5555c1c7029f5f66e1ff2d87d2115fdaf7edf95489007330f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f834ec041a5e5fd186567eed4651136f52f2a785f2eee742e12da0470a1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-85t4r\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.090599 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.105707 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.123876 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.124473 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 
10:55:23.124526 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.124549 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.124574 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.124592 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:23Z","lastTransitionTime":"2026-03-20T10:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.137021 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h5s76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0468323-460e-4bf3-be74-9c2330bde834\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h5s76\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.152127 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.168274 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.182135 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.195490 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:55:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.208073 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.212573 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0468323-460e-4bf3-be74-9c2330bde834-metrics-certs\") pod \"network-metrics-daemon-h5s76\" (UID: \"e0468323-460e-4bf3-be74-9c2330bde834\") " pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:55:23 crc kubenswrapper[4695]: E0320 10:55:23.212781 4695 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:55:23 crc kubenswrapper[4695]: E0320 10:55:23.212892 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0468323-460e-4bf3-be74-9c2330bde834-metrics-certs podName:e0468323-460e-4bf3-be74-9c2330bde834 nodeName:}" failed. No retries permitted until 2026-03-20 10:55:25.212864375 +0000 UTC m=+102.993469938 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0468323-460e-4bf3-be74-9c2330bde834-metrics-certs") pod "network-metrics-daemon-h5s76" (UID: "e0468323-460e-4bf3-be74-9c2330bde834") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.225931 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308a0ad5e55245d181191a0ed8b1dfd84d8aa18e32d617bed8e3611679fc5b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714
c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net
.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.227411 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.227443 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.227456 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.227476 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.227491 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:23Z","lastTransitionTime":"2026-03-20T10:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.330252 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.330299 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.330317 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.330335 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.330349 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:23Z","lastTransitionTime":"2026-03-20T10:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.434013 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.434050 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.434061 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.434075 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.434085 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:23Z","lastTransitionTime":"2026-03-20T10:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.536412 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.536460 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.536474 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.536493 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.536506 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:23Z","lastTransitionTime":"2026-03-20T10:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.639752 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.639802 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.639820 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.639845 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.639863 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:23Z","lastTransitionTime":"2026-03-20T10:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.743170 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.743229 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.743244 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.743267 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.743282 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:23Z","lastTransitionTime":"2026-03-20T10:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.846317 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.847005 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.847037 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.847065 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.847087 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:23Z","lastTransitionTime":"2026-03-20T10:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.886752 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.886801 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.886896 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:23 crc kubenswrapper[4695]: E0320 10:55:23.887055 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:55:23 crc kubenswrapper[4695]: E0320 10:55:23.887187 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:55:23 crc kubenswrapper[4695]: E0320 10:55:23.887309 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.950089 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.950143 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.950154 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.950174 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:23 crc kubenswrapper[4695]: I0320 10:55:23.950192 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:23Z","lastTransitionTime":"2026-03-20T10:55:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.026009 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.026054 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.026070 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.026086 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.026098 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:24Z","lastTransitionTime":"2026-03-20T10:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:24 crc kubenswrapper[4695]: E0320 10:55:24.039111 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.043368 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.043403 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.043416 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.043436 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.043450 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:24Z","lastTransitionTime":"2026-03-20T10:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:24 crc kubenswrapper[4695]: E0320 10:55:24.057201 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.061667 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.061719 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.061734 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.061758 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.061774 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:24Z","lastTransitionTime":"2026-03-20T10:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:24 crc kubenswrapper[4695]: E0320 10:55:24.075645 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.079988 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.080038 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.080049 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.080064 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.080077 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:24Z","lastTransitionTime":"2026-03-20T10:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:24 crc kubenswrapper[4695]: E0320 10:55:24.093188 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.097118 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.097160 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.097170 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.097189 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.097199 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:24Z","lastTransitionTime":"2026-03-20T10:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:24 crc kubenswrapper[4695]: E0320 10:55:24.112145 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:24 crc kubenswrapper[4695]: E0320 10:55:24.112296 4695 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.114092 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.114151 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.114164 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.114183 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.114195 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:24Z","lastTransitionTime":"2026-03-20T10:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.216508 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.216547 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.216557 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.216573 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.216582 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:24Z","lastTransitionTime":"2026-03-20T10:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.319203 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.319241 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.319329 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.319366 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.319381 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:24Z","lastTransitionTime":"2026-03-20T10:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.425092 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.425156 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.425169 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.425188 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.425202 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:24Z","lastTransitionTime":"2026-03-20T10:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.527839 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.527932 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.527942 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.527967 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.527985 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:24Z","lastTransitionTime":"2026-03-20T10:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.630899 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.630964 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.630974 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.630987 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.630997 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:24Z","lastTransitionTime":"2026-03-20T10:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.733975 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.734039 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.734055 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.734079 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.734098 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:24Z","lastTransitionTime":"2026-03-20T10:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.836960 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.837013 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.837022 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.837044 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.837054 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:24Z","lastTransitionTime":"2026-03-20T10:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.886558 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:55:24 crc kubenswrapper[4695]: E0320 10:55:24.886737 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.939433 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.939484 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.939496 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.939512 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:24 crc kubenswrapper[4695]: I0320 10:55:24.939528 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:24Z","lastTransitionTime":"2026-03-20T10:55:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.042788 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.043103 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.043132 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.043163 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.043179 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:25Z","lastTransitionTime":"2026-03-20T10:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.145807 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.145862 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.145873 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.145894 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.145932 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:25Z","lastTransitionTime":"2026-03-20T10:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.233739 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0468323-460e-4bf3-be74-9c2330bde834-metrics-certs\") pod \"network-metrics-daemon-h5s76\" (UID: \"e0468323-460e-4bf3-be74-9c2330bde834\") " pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:55:25 crc kubenswrapper[4695]: E0320 10:55:25.233889 4695 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:55:25 crc kubenswrapper[4695]: E0320 10:55:25.233974 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0468323-460e-4bf3-be74-9c2330bde834-metrics-certs podName:e0468323-460e-4bf3-be74-9c2330bde834 nodeName:}" failed. No retries permitted until 2026-03-20 10:55:29.233955408 +0000 UTC m=+107.014560971 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0468323-460e-4bf3-be74-9c2330bde834-metrics-certs") pod "network-metrics-daemon-h5s76" (UID: "e0468323-460e-4bf3-be74-9c2330bde834") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.250569 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.250616 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.250628 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.250647 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.250660 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:25Z","lastTransitionTime":"2026-03-20T10:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.352825 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.352865 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.352874 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.352890 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.352901 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:25Z","lastTransitionTime":"2026-03-20T10:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.455647 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.455697 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.455706 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.455722 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.455735 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:25Z","lastTransitionTime":"2026-03-20T10:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.558546 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.558600 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.558613 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.558632 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.558643 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:25Z","lastTransitionTime":"2026-03-20T10:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.661157 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.661196 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.661206 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.661221 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.661232 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:25Z","lastTransitionTime":"2026-03-20T10:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.763862 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.763896 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.763921 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.763935 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.763945 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:25Z","lastTransitionTime":"2026-03-20T10:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.839420 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:55:25 crc kubenswrapper[4695]: E0320 10:55:25.839606 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 10:55:57.839577932 +0000 UTC m=+135.620183495 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.839664 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.839713 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.839787 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:55:25 crc kubenswrapper[4695]: E0320 10:55:25.839797 4695 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.839809 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:55:25 crc kubenswrapper[4695]: E0320 10:55:25.839869 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:55:57.839850839 +0000 UTC m=+135.620456412 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 10:55:25 crc kubenswrapper[4695]: E0320 10:55:25.839958 4695 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 10:55:25 crc kubenswrapper[4695]: E0320 10:55:25.839997 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:55:57.839989863 +0000 UTC m=+135.620595426 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 10:55:25 crc kubenswrapper[4695]: E0320 10:55:25.840056 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 10:55:25 crc kubenswrapper[4695]: E0320 10:55:25.840067 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 10:55:25 crc kubenswrapper[4695]: E0320 10:55:25.840078 4695 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 10:55:25 crc kubenswrapper[4695]: E0320 10:55:25.840109 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:55:57.840102406 +0000 UTC m=+135.620707959 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 10:55:25 crc kubenswrapper[4695]: E0320 10:55:25.840220 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 10:55:25 crc kubenswrapper[4695]: E0320 10:55:25.840240 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 10:55:25 crc kubenswrapper[4695]: E0320 10:55:25.840253 4695 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 10:55:25 crc kubenswrapper[4695]: E0320 10:55:25.840291 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:55:57.840280531 +0000 UTC m=+135.620886194 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.867256 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.867316 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.867328 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.867349 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.867363 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:25Z","lastTransitionTime":"2026-03-20T10:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.886555 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.886606 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.886631 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:55:25 crc kubenswrapper[4695]: E0320 10:55:25.886715 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 10:55:25 crc kubenswrapper[4695]: E0320 10:55:25.886840 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:55:25 crc kubenswrapper[4695]: E0320 10:55:25.886883 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.970541 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.970602 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.970611 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.970645 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:55:25 crc kubenswrapper[4695]: I0320 10:55:25.970656 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:25Z","lastTransitionTime":"2026-03-20T10:55:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.073418 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.073495 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.073520 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.073548 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.073569 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:26Z","lastTransitionTime":"2026-03-20T10:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.175982 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.176031 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.176041 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.176056 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.176066 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:26Z","lastTransitionTime":"2026-03-20T10:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.279166 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.279216 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.279226 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.279244 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.279255 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:26Z","lastTransitionTime":"2026-03-20T10:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.382159 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.382207 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.382217 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.382236 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.382248 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:26Z","lastTransitionTime":"2026-03-20T10:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.485525 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.485597 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.485622 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.485653 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.485676 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:26Z","lastTransitionTime":"2026-03-20T10:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.588141 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.588218 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.588237 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.588261 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.588278 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:26Z","lastTransitionTime":"2026-03-20T10:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.691641 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.691688 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.691699 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.691718 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.691733 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:26Z","lastTransitionTime":"2026-03-20T10:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.794032 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.794087 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.794108 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.794132 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.794149 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:26Z","lastTransitionTime":"2026-03-20T10:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.886159 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76"
Mar 20 10:55:26 crc kubenswrapper[4695]: E0320 10:55:26.886432 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.896110 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.896157 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.896167 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.896183 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.896194 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:26Z","lastTransitionTime":"2026-03-20T10:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.998575 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.998625 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.998636 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.998655 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:55:26 crc kubenswrapper[4695]: I0320 10:55:26.998666 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:26Z","lastTransitionTime":"2026-03-20T10:55:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.102198 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.102254 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.102267 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.102289 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.102302 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:27Z","lastTransitionTime":"2026-03-20T10:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.205334 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.205393 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.205407 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.205430 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.205447 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:27Z","lastTransitionTime":"2026-03-20T10:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.308379 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.308415 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.308426 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.308441 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.308450 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:27Z","lastTransitionTime":"2026-03-20T10:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.410737 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.410778 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.410787 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.410799 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.410809 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:27Z","lastTransitionTime":"2026-03-20T10:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.513808 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.513876 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.513898 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.513961 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.513985 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:27Z","lastTransitionTime":"2026-03-20T10:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.616424 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.616475 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.616488 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.616507 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.616519 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:27Z","lastTransitionTime":"2026-03-20T10:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.720032 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.720109 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.720131 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.720163 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.720187 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:27Z","lastTransitionTime":"2026-03-20T10:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.823274 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.823322 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.823331 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.823349 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.823361 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:27Z","lastTransitionTime":"2026-03-20T10:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.886171 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.886287 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.886287 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:55:27 crc kubenswrapper[4695]: E0320 10:55:27.886414 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 10:55:27 crc kubenswrapper[4695]: E0320 10:55:27.886494 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 10:55:27 crc kubenswrapper[4695]: E0320 10:55:27.886563 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.925855 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.925897 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.925925 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.925941 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:55:27 crc kubenswrapper[4695]: I0320 10:55:27.925955 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:27Z","lastTransitionTime":"2026-03-20T10:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.029177 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.029249 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.029259 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.029278 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.029290 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:28Z","lastTransitionTime":"2026-03-20T10:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.131839 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.131887 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.131897 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.131931 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.131943 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:28Z","lastTransitionTime":"2026-03-20T10:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.235394 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.235476 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.235491 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.235518 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.235534 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:28Z","lastTransitionTime":"2026-03-20T10:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.338583 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.338631 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.338656 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.338672 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.338683 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:28Z","lastTransitionTime":"2026-03-20T10:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.440852 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.440926 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.440947 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.440969 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.440986 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:28Z","lastTransitionTime":"2026-03-20T10:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.543597 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.543653 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.543663 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.543679 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.543690 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:28Z","lastTransitionTime":"2026-03-20T10:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.646219 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.646273 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.646282 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.646296 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.646305 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:28Z","lastTransitionTime":"2026-03-20T10:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.748723 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.748766 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.748777 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.748792 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.748806 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:28Z","lastTransitionTime":"2026-03-20T10:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.851845 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.851942 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.851953 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.851967 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.851981 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:28Z","lastTransitionTime":"2026-03-20T10:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.886924 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:55:28 crc kubenswrapper[4695]: E0320 10:55:28.887095 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.953925 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.953952 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.953960 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.953972 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:28 crc kubenswrapper[4695]: I0320 10:55:28.953980 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:28Z","lastTransitionTime":"2026-03-20T10:55:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.057283 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.057339 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.057356 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.057382 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.057401 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:29Z","lastTransitionTime":"2026-03-20T10:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.160366 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.160428 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.160439 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.160455 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.160466 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:29Z","lastTransitionTime":"2026-03-20T10:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.264230 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.264320 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.264335 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.264354 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.264366 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:29Z","lastTransitionTime":"2026-03-20T10:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.278161 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0468323-460e-4bf3-be74-9c2330bde834-metrics-certs\") pod \"network-metrics-daemon-h5s76\" (UID: \"e0468323-460e-4bf3-be74-9c2330bde834\") " pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:55:29 crc kubenswrapper[4695]: E0320 10:55:29.278302 4695 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:55:29 crc kubenswrapper[4695]: E0320 10:55:29.278363 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0468323-460e-4bf3-be74-9c2330bde834-metrics-certs podName:e0468323-460e-4bf3-be74-9c2330bde834 nodeName:}" failed. No retries permitted until 2026-03-20 10:55:37.278345483 +0000 UTC m=+115.058951046 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0468323-460e-4bf3-be74-9c2330bde834-metrics-certs") pod "network-metrics-daemon-h5s76" (UID: "e0468323-460e-4bf3-be74-9c2330bde834") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.367483 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.367539 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.367549 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.367565 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.367576 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:29Z","lastTransitionTime":"2026-03-20T10:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.470644 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.470696 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.470708 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.470728 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.470738 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:29Z","lastTransitionTime":"2026-03-20T10:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.579355 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.579413 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.579423 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.579440 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.579454 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:29Z","lastTransitionTime":"2026-03-20T10:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.682164 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.682224 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.682235 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.682253 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.682264 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:29Z","lastTransitionTime":"2026-03-20T10:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.784544 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.784632 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.784647 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.784667 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.784679 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:29Z","lastTransitionTime":"2026-03-20T10:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.886070 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.886111 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.886111 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:29 crc kubenswrapper[4695]: E0320 10:55:29.886300 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:55:29 crc kubenswrapper[4695]: E0320 10:55:29.886388 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:55:29 crc kubenswrapper[4695]: E0320 10:55:29.886982 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.887709 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.887828 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.887896 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.888020 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.888103 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:29Z","lastTransitionTime":"2026-03-20T10:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.991038 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.991099 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.991111 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.991130 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:29 crc kubenswrapper[4695]: I0320 10:55:29.991140 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:29Z","lastTransitionTime":"2026-03-20T10:55:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.094359 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.094427 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.094440 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.094461 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.094473 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:30Z","lastTransitionTime":"2026-03-20T10:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.197496 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.197535 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.197544 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.197557 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.197566 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:30Z","lastTransitionTime":"2026-03-20T10:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.300555 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.300626 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.300646 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.300668 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.300682 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:30Z","lastTransitionTime":"2026-03-20T10:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.403869 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.403947 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.403956 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.403970 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.403980 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:30Z","lastTransitionTime":"2026-03-20T10:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.506790 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.506834 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.506847 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.506866 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.506879 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:30Z","lastTransitionTime":"2026-03-20T10:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.609944 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.610298 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.610384 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.610465 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.610530 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:30Z","lastTransitionTime":"2026-03-20T10:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.713368 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.713703 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.713796 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.713891 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.714001 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:30Z","lastTransitionTime":"2026-03-20T10:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.817441 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.817831 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.817935 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.818043 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.818206 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:30Z","lastTransitionTime":"2026-03-20T10:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.886368 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:55:30 crc kubenswrapper[4695]: E0320 10:55:30.887114 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.921507 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.921569 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.921588 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.921611 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:30 crc kubenswrapper[4695]: I0320 10:55:30.921628 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:30Z","lastTransitionTime":"2026-03-20T10:55:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.024445 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.024508 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.024526 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.024550 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.024568 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:31Z","lastTransitionTime":"2026-03-20T10:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.127014 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.127074 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.127086 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.127111 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.127125 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:31Z","lastTransitionTime":"2026-03-20T10:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.230291 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.230347 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.230359 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.230382 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.230396 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:31Z","lastTransitionTime":"2026-03-20T10:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.333582 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.333648 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.333660 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.333681 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.333694 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:31Z","lastTransitionTime":"2026-03-20T10:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.436973 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.437044 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.437053 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.437080 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.437092 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:31Z","lastTransitionTime":"2026-03-20T10:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.540020 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.540073 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.540090 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.540110 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.540128 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:31Z","lastTransitionTime":"2026-03-20T10:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.643577 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.643615 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.643625 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.643664 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.643677 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:31Z","lastTransitionTime":"2026-03-20T10:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.746994 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.747140 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.747166 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.747198 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.747222 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:31Z","lastTransitionTime":"2026-03-20T10:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.849839 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.849932 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.849951 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.849978 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.849996 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:31Z","lastTransitionTime":"2026-03-20T10:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.886554 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.886607 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.886676 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:31 crc kubenswrapper[4695]: E0320 10:55:31.886756 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:55:31 crc kubenswrapper[4695]: E0320 10:55:31.886876 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:55:31 crc kubenswrapper[4695]: E0320 10:55:31.887071 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.952843 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.952883 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.952894 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.952925 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:31 crc kubenswrapper[4695]: I0320 10:55:31.952939 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:31Z","lastTransitionTime":"2026-03-20T10:55:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.055213 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.055271 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.055288 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.055306 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.055318 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:32Z","lastTransitionTime":"2026-03-20T10:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.158472 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.158534 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.158553 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.158581 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.158617 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:32Z","lastTransitionTime":"2026-03-20T10:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.261094 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.261150 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.261162 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.261180 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.261191 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:32Z","lastTransitionTime":"2026-03-20T10:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.370179 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.370237 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.370250 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.370271 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.370282 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:32Z","lastTransitionTime":"2026-03-20T10:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.472903 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.472968 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.472983 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.473000 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.473012 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:32Z","lastTransitionTime":"2026-03-20T10:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.575810 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.575854 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.575866 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.575882 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.575895 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:32Z","lastTransitionTime":"2026-03-20T10:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.677865 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.677972 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.677985 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.678002 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.678013 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:32Z","lastTransitionTime":"2026-03-20T10:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.782608 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.782679 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.782691 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.782718 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.782735 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:32Z","lastTransitionTime":"2026-03-20T10:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.885864 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.885901 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.885925 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.885943 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.885953 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:32Z","lastTransitionTime":"2026-03-20T10:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.886079 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:55:32 crc kubenswrapper[4695]: E0320 10:55:32.886259 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.903155 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.918225 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.932218 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.949384 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:55:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.961718 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.979141 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308a0ad5e55245d181191a0ed8b1dfd84d8aa18e
32d617bed8e3611679fc5b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.989386 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.989423 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.989432 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.989463 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:32 crc kubenswrapper[4695]: I0320 10:55:32.989475 4695 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:32Z","lastTransitionTime":"2026-03-20T10:55:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.005360 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ca146416501b5149e392a19955444d0679d445ec98b41acb7c6f7c720c924c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ca146416501b5149e392a19955444d0679d445ec98b41acb7c6f7c720c924c0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"message\\\":\\\"work=default: []services.lbConfig(nil)\\\\nI0320 10:55:20.524670 6581 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:55:20.524676 6581 services_controller.go:445] Built service openshift-apiserver/check-endpoints LB template configs for network=default: []services.lbConfig(nil)\\\\nI0320 10:55:20.524700 6581 handler.go:208] Removed *v1.EgressIP 
event handler 8\\\\nI0320 10:55:20.524707 6581 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:55:20.524699 6581 services_controller.go:451] Built service openshift-apiserver/check-endpoints cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 10:55:20.524718 6581 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:55:20.524787 6581 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nx4bc_openshift-ovn-kubernetes(7010d107-c3b1-4cc2-83c2-523df13ecd43)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc25931
3e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.018274 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qrsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53713843-d62b-411e-908c-18f9452f6bf2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecd827c2bd1743e23f1f976847aab27852abff1a9f4bd54950e6d359d2ee2533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qrsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.032441 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e6a699ee6242e5555c1c7029f5f66e1ff2d87d2115fdaf7edf95489007330f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f834ec041a5e5fd186567eed4651136f52f2a785f2eee742e12da0470a1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-85t4r\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.054879 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7
c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.072414 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\"
,\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.088703 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.093242 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.093274 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.093285 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.093303 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.093314 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:33Z","lastTransitionTime":"2026-03-20T10:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.104123 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:
55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.117780 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.129874 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h5s76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0468323-460e-4bf3-be74-9c2330bde834\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h5s76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:33 crc 
kubenswrapper[4695]: I0320 10:55:33.151425 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.171280 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.195827 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.195936 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.195958 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.195988 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.196005 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:33Z","lastTransitionTime":"2026-03-20T10:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.299383 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.299451 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.299461 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.299483 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.299493 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:33Z","lastTransitionTime":"2026-03-20T10:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.408017 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.408092 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.408106 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.408132 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.408147 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:33Z","lastTransitionTime":"2026-03-20T10:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.511876 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.511941 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.511954 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.511970 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.511982 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:33Z","lastTransitionTime":"2026-03-20T10:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.614650 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.614766 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.614792 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.614823 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.614844 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:33Z","lastTransitionTime":"2026-03-20T10:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.718031 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.718108 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.718124 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.718143 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.718156 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:33Z","lastTransitionTime":"2026-03-20T10:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.821578 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.821647 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.821662 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.821687 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.821704 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:33Z","lastTransitionTime":"2026-03-20T10:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.885994 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.886007 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.886023 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:33 crc kubenswrapper[4695]: E0320 10:55:33.886155 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:55:33 crc kubenswrapper[4695]: E0320 10:55:33.886424 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:55:33 crc kubenswrapper[4695]: E0320 10:55:33.886467 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.925049 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.925107 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.925120 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.925141 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:33 crc kubenswrapper[4695]: I0320 10:55:33.925159 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:33Z","lastTransitionTime":"2026-03-20T10:55:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.028423 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.028490 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.028505 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.028527 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.028540 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:34Z","lastTransitionTime":"2026-03-20T10:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.131751 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.131813 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.131828 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.131853 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.131869 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:34Z","lastTransitionTime":"2026-03-20T10:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.234714 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.234778 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.234791 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.234816 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.234858 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:34Z","lastTransitionTime":"2026-03-20T10:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.265608 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.265663 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.265688 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.265708 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.265718 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:34Z","lastTransitionTime":"2026-03-20T10:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:34 crc kubenswrapper[4695]: E0320 10:55:34.281538 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.285930 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.285995 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.286008 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.286036 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.286050 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:34Z","lastTransitionTime":"2026-03-20T10:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:34 crc kubenswrapper[4695]: E0320 10:55:34.298326 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.303134 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.303210 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.303221 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.303263 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.303277 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:34Z","lastTransitionTime":"2026-03-20T10:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:34 crc kubenswrapper[4695]: E0320 10:55:34.315814 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.320578 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.320625 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.320638 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.320658 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.320670 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:34Z","lastTransitionTime":"2026-03-20T10:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:34 crc kubenswrapper[4695]: E0320 10:55:34.333279 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.337149 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.337202 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.337217 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.337238 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.337250 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:34Z","lastTransitionTime":"2026-03-20T10:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:34 crc kubenswrapper[4695]: E0320 10:55:34.348998 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:34 crc kubenswrapper[4695]: E0320 10:55:34.349142 4695 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.351068 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.351101 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.351112 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.351130 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.351146 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:34Z","lastTransitionTime":"2026-03-20T10:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.453437 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.453494 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.453508 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.453533 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.453550 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:34Z","lastTransitionTime":"2026-03-20T10:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.556135 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.556173 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.556183 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.556200 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.556210 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:34Z","lastTransitionTime":"2026-03-20T10:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.658386 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.658444 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.658457 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.658474 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.658486 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:34Z","lastTransitionTime":"2026-03-20T10:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.761608 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.761656 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.761670 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.761690 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.761704 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:34Z","lastTransitionTime":"2026-03-20T10:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.864193 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.864248 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.864261 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.864280 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.864319 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:34Z","lastTransitionTime":"2026-03-20T10:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.886588 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:55:34 crc kubenswrapper[4695]: E0320 10:55:34.886734 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.887697 4695 scope.go:117] "RemoveContainer" containerID="6ca146416501b5149e392a19955444d0679d445ec98b41acb7c6f7c720c924c0" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.967266 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.967805 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.967847 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.967877 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:34 crc kubenswrapper[4695]: I0320 10:55:34.967935 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:34Z","lastTransitionTime":"2026-03-20T10:55:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.072657 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.072732 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.072757 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.072789 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.072814 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:35Z","lastTransitionTime":"2026-03-20T10:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.175596 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.175646 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.175656 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.175673 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.175686 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:35Z","lastTransitionTime":"2026-03-20T10:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.278636 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.278693 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.278705 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.278728 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.278740 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:35Z","lastTransitionTime":"2026-03-20T10:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.381749 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.381799 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.381813 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.381835 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.381848 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:35Z","lastTransitionTime":"2026-03-20T10:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.463861 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx4bc_7010d107-c3b1-4cc2-83c2-523df13ecd43/ovnkube-controller/1.log" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.467280 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" event={"ID":"7010d107-c3b1-4cc2-83c2-523df13ecd43","Type":"ContainerStarted","Data":"4992cdd2d101bccf764244535fda9b15ae017e0c1cb4b4fd221999a605d8cc7d"} Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.467820 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.479988 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.484563 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.484614 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.484627 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.484649 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.484666 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:35Z","lastTransitionTime":"2026-03-20T10:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.494461 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.505948 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.522533 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308a0ad5e55245d181191a0ed8b1dfd84d8aa18e32d617bed8e3611679fc5b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c6b
10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.543528 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac037
2f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.560248 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\"
,\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.574633 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.587273 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.587839 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:35 crc 
kubenswrapper[4695]: I0320 10:55:35.587876 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.587888 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.587921 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.587932 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:35Z","lastTransitionTime":"2026-03-20T10:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.607584 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4992cdd2d101bccf764244535fda9b15ae017e0c1cb4b4fd221999a605d8cc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ca146416501b5149e392a19955444d0679d445ec98b41acb7c6f7c720c924c0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"message\\\":\\\"work=default: []services.lbConfig(nil)\\\\nI0320 10:55:20.524670 6581 ovnkube.go:599] 
Stopped ovnkube\\\\nI0320 10:55:20.524676 6581 services_controller.go:445] Built service openshift-apiserver/check-endpoints LB template configs for network=default: []services.lbConfig(nil)\\\\nI0320 10:55:20.524700 6581 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:55:20.524707 6581 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:55:20.524699 6581 services_controller.go:451] Built service openshift-apiserver/check-endpoints cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 10:55:20.524718 6581 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:55:20.524787 6581 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.618357 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qrsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53713843-d62b-411e-908c-18f9452f6bf2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecd827c2bd1743e23f1f976847aab27852abff1a9f4bd54950e6d359d2ee2533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qrsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.631062 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e6a699ee6242e5555c1c7029f5f66e1ff2d87d2115fdaf7edf95489007330f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f834ec041a5e5fd186567eed4651136f52f2a785f2eee742e12da0470a1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-85t4r\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.641832 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.654797 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.668148 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.680715 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h5s76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0468323-460e-4bf3-be74-9c2330bde834\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h5s76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:35 crc 
kubenswrapper[4695]: I0320 10:55:35.690698 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.690764 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.690773 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.690787 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.690798 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:35Z","lastTransitionTime":"2026-03-20T10:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.697721 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.708899 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:35Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.793744 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.793795 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.793805 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:35 crc 
kubenswrapper[4695]: I0320 10:55:35.793824 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.793838 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:35Z","lastTransitionTime":"2026-03-20T10:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.886324 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.886364 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.886404 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:35 crc kubenswrapper[4695]: E0320 10:55:35.886509 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:55:35 crc kubenswrapper[4695]: E0320 10:55:35.886635 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:55:35 crc kubenswrapper[4695]: E0320 10:55:35.886771 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.896656 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.896707 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.896720 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.896738 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.896749 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:35Z","lastTransitionTime":"2026-03-20T10:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.999458 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.999519 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.999531 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.999549 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:35 crc kubenswrapper[4695]: I0320 10:55:35.999561 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:35Z","lastTransitionTime":"2026-03-20T10:55:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.102220 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.102271 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.102287 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.102308 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.102324 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:36Z","lastTransitionTime":"2026-03-20T10:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.205537 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.205602 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.205636 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.205667 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.205691 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:36Z","lastTransitionTime":"2026-03-20T10:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.309199 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.309242 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.309252 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.309270 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.309281 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:36Z","lastTransitionTime":"2026-03-20T10:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.412755 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.412802 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.412811 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.412826 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.412836 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:36Z","lastTransitionTime":"2026-03-20T10:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.477212 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx4bc_7010d107-c3b1-4cc2-83c2-523df13ecd43/ovnkube-controller/2.log" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.478367 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx4bc_7010d107-c3b1-4cc2-83c2-523df13ecd43/ovnkube-controller/1.log" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.481705 4695 generic.go:334] "Generic (PLEG): container finished" podID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerID="4992cdd2d101bccf764244535fda9b15ae017e0c1cb4b4fd221999a605d8cc7d" exitCode=1 Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.481767 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" event={"ID":"7010d107-c3b1-4cc2-83c2-523df13ecd43","Type":"ContainerDied","Data":"4992cdd2d101bccf764244535fda9b15ae017e0c1cb4b4fd221999a605d8cc7d"} Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.481826 4695 scope.go:117] "RemoveContainer" containerID="6ca146416501b5149e392a19955444d0679d445ec98b41acb7c6f7c720c924c0" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.483657 4695 scope.go:117] "RemoveContainer" containerID="4992cdd2d101bccf764244535fda9b15ae017e0c1cb4b4fd221999a605d8cc7d" Mar 20 10:55:36 crc kubenswrapper[4695]: E0320 10:55:36.484143 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nx4bc_openshift-ovn-kubernetes(7010d107-c3b1-4cc2-83c2-523df13ecd43)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.497402 4695 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"im
age\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.513339 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:55:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.515833 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.515959 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.515973 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.515994 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.516312 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:36Z","lastTransitionTime":"2026-03-20T10:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.529517 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.552244 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308a0ad5e55245d181191a0ed8b1dfd84d8aa18e32d617bed8e3611679fc5b7d\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\
",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-
cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.576058 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.594249 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\",\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.608154 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.620302 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.620349 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.620367 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.620391 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.620407 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:36Z","lastTransitionTime":"2026-03-20T10:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.627595 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:
55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.649676 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4992cdd2d101bccf764244535fda9b15ae017e0c1cb4b4fd221999a605d8cc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ca146416501b5149e392a19955444d0679d445ec98b41acb7c6f7c720c924c0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"message\\\":\\\"work=default: []services.lbConfig(nil)\\\\nI0320 10:55:20.524670 6581 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:55:20.524676 6581 services_controller.go:445] Built service openshift-apiserver/check-endpoints LB template configs for network=default: []services.lbConfig(nil)\\\\nI0320 10:55:20.524700 6581 handler.go:208] Removed *v1.EgressIP 
event handler 8\\\\nI0320 10:55:20.524707 6581 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:55:20.524699 6581 services_controller.go:451] Built service openshift-apiserver/check-endpoints cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0320 10:55:20.524718 6581 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:55:20.524787 6581 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4992cdd2d101bccf764244535fda9b15ae017e0c1cb4b4fd221999a605d8cc7d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:35Z\\\",\\\"message\\\":\\\"0:55:35.765482 6815 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0320 10:55:35.765488 6815 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0320 10:55:35.765503 6815 default_network_controller.go:776] Recording success event on pod 
openshift-kube-apiserver/kube-apiserver-crc\\\\nI0320 10:55:35.765485 6815 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0320 10:55:35.765509 6815 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 10:55:35.765522 6815 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0320 10:55:35.765530 6815 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0320 10:55:35.765561 6815 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF0320 10:55:35.765601 6815 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e58953
45f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.664082 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qrsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53713843-d62b-411e-908c-18f9452f6bf2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecd827c2bd1743e23f1f976847aab27852abff1a9f4bd54950e6d359d2ee2533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qrsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.678304 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e6a699ee6242e5555c1c7029f5f66e1ff2d87d2115fdaf7edf95489007330f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f834ec041a5e5fd186567eed4651136f52f2a785f2eee742e12da0470a1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-85t4r\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.694553 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.710756 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.723310 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.723344 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.723356 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.723370 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.723379 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:36Z","lastTransitionTime":"2026-03-20T10:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.724818 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.737798 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h5s76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0468323-460e-4bf3-be74-9c2330bde834\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h5s76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:36 crc 
kubenswrapper[4695]: I0320 10:55:36.753762 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.767658 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:36Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.826546 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.826640 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.826666 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:36 crc 
kubenswrapper[4695]: I0320 10:55:36.826700 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.826730 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:36Z","lastTransitionTime":"2026-03-20T10:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.886380 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:55:36 crc kubenswrapper[4695]: E0320 10:55:36.886672 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.929130 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.929184 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.929204 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.929224 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:36 crc kubenswrapper[4695]: I0320 10:55:36.929238 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:36Z","lastTransitionTime":"2026-03-20T10:55:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.031720 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.031791 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.031806 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.031828 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.031848 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:37Z","lastTransitionTime":"2026-03-20T10:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.134659 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.134714 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.134729 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.134751 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.134765 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:37Z","lastTransitionTime":"2026-03-20T10:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.238159 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.238227 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.238242 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.238269 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.238286 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:37Z","lastTransitionTime":"2026-03-20T10:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.342056 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.342107 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.342118 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.342140 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.342155 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:37Z","lastTransitionTime":"2026-03-20T10:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.364382 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0468323-460e-4bf3-be74-9c2330bde834-metrics-certs\") pod \"network-metrics-daemon-h5s76\" (UID: \"e0468323-460e-4bf3-be74-9c2330bde834\") " pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:55:37 crc kubenswrapper[4695]: E0320 10:55:37.364609 4695 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:55:37 crc kubenswrapper[4695]: E0320 10:55:37.364715 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0468323-460e-4bf3-be74-9c2330bde834-metrics-certs podName:e0468323-460e-4bf3-be74-9c2330bde834 nodeName:}" failed. No retries permitted until 2026-03-20 10:55:53.364669836 +0000 UTC m=+131.145275409 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0468323-460e-4bf3-be74-9c2330bde834-metrics-certs") pod "network-metrics-daemon-h5s76" (UID: "e0468323-460e-4bf3-be74-9c2330bde834") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.445417 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.445491 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.445511 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.445535 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.445553 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:37Z","lastTransitionTime":"2026-03-20T10:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.487455 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx4bc_7010d107-c3b1-4cc2-83c2-523df13ecd43/ovnkube-controller/2.log" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.491389 4695 scope.go:117] "RemoveContainer" containerID="4992cdd2d101bccf764244535fda9b15ae017e0c1cb4b4fd221999a605d8cc7d" Mar 20 10:55:37 crc kubenswrapper[4695]: E0320 10:55:37.491549 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nx4bc_openshift-ovn-kubernetes(7010d107-c3b1-4cc2-83c2-523df13ecd43)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.508111 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h5s76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0468323-460e-4bf3-be74-9c2330bde834\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h5s76\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.527576 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.542751 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.548067 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.548127 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.548139 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.548160 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.548174 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:37Z","lastTransitionTime":"2026-03-20T10:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.559221 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.576644 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.592827 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.606793 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.621650 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:55:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.634992 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.650382 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.650435 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.650449 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.650466 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.650478 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:37Z","lastTransitionTime":"2026-03-20T10:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.651846 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308a0ad5e55245d181191a0ed8b1dfd84d8aa18e32d617bed8e3611679fc5b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.664839 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qrsdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"53713843-d62b-411e-908c-18f9452f6bf2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecd827c2bd1743e23f1f976847aab27852abff1a9f4bd54950e6d359d2ee2533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qrsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.678421 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e6a699ee6242e5555c1c7029f5f66e1ff2d87d2115fdaf7edf95489007330f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f834ec041a5e5fd186567eed4651136f52f
2a785f2eee742e12da0470a1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-85t4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.711714 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.726512 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\",\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.742713 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.752932 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.752985 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.753003 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.753028 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.753046 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:37Z","lastTransitionTime":"2026-03-20T10:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.761433 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:
55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.782642 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4992cdd2d101bccf764244535fda9b15ae017e0c1cb4b4fd221999a605d8cc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4992cdd2d101bccf764244535fda9b15ae017e0c1cb4b4fd221999a605d8cc7d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:35Z\\\",\\\"message\\\":\\\"0:55:35.765482 6815 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0320 10:55:35.765488 6815 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0320 10:55:35.765503 6815 default_network_controller.go:776] Recording success 
event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0320 10:55:35.765485 6815 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0320 10:55:35.765509 6815 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 10:55:35.765522 6815 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0320 10:55:35.765530 6815 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0320 10:55:35.765561 6815 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF0320 10:55:35.765601 6815 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nx4bc_openshift-ovn-kubernetes(7010d107-c3b1-4cc2-83c2-523df13ecd43)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc25931
3e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.860497 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.860531 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.860543 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.860560 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.860569 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:37Z","lastTransitionTime":"2026-03-20T10:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.886043 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:37 crc kubenswrapper[4695]: E0320 10:55:37.886233 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.886300 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:37 crc kubenswrapper[4695]: E0320 10:55:37.886434 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.886317 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:37 crc kubenswrapper[4695]: E0320 10:55:37.886550 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.963368 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.963432 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.963445 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.963464 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:37 crc kubenswrapper[4695]: I0320 10:55:37.963477 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:37Z","lastTransitionTime":"2026-03-20T10:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.066147 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.066201 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.066209 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.066224 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.066233 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:38Z","lastTransitionTime":"2026-03-20T10:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.169433 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.169498 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.169509 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.169526 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.169538 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:38Z","lastTransitionTime":"2026-03-20T10:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.272471 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.272529 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.272543 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.272564 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.272577 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:38Z","lastTransitionTime":"2026-03-20T10:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.375297 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.375359 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.375370 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.375391 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.375406 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:38Z","lastTransitionTime":"2026-03-20T10:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.478210 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.478269 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.478282 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.478305 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.478322 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:38Z","lastTransitionTime":"2026-03-20T10:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.581263 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.581317 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.581334 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.581352 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.581365 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:38Z","lastTransitionTime":"2026-03-20T10:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.683843 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.683890 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.683902 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.683949 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.683963 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:38Z","lastTransitionTime":"2026-03-20T10:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.786761 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.786821 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.786830 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.786847 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.786863 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:38Z","lastTransitionTime":"2026-03-20T10:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.887079 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:55:38 crc kubenswrapper[4695]: E0320 10:55:38.887319 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.890188 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.890271 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.890285 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.890305 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.890407 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:38Z","lastTransitionTime":"2026-03-20T10:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.994623 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.994690 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.994702 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.994724 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:38 crc kubenswrapper[4695]: I0320 10:55:38.994737 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:38Z","lastTransitionTime":"2026-03-20T10:55:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.097949 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.098002 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.098016 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.098033 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.098044 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:39Z","lastTransitionTime":"2026-03-20T10:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.200503 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.200550 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.200564 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.200582 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.200599 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:39Z","lastTransitionTime":"2026-03-20T10:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.303587 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.303671 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.303692 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.303718 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.303736 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:39Z","lastTransitionTime":"2026-03-20T10:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.406967 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.407028 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.407043 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.407062 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.407076 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:39Z","lastTransitionTime":"2026-03-20T10:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.510321 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.510390 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.510413 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.510442 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.510466 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:39Z","lastTransitionTime":"2026-03-20T10:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.614109 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.614204 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.614219 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.614245 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.614259 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:39Z","lastTransitionTime":"2026-03-20T10:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.717646 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.717797 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.717814 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.717837 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.717850 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:39Z","lastTransitionTime":"2026-03-20T10:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.821129 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.821185 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.821199 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.821217 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.821229 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:39Z","lastTransitionTime":"2026-03-20T10:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.886694 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.886788 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.886713 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:39 crc kubenswrapper[4695]: E0320 10:55:39.886903 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:55:39 crc kubenswrapper[4695]: E0320 10:55:39.887021 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:55:39 crc kubenswrapper[4695]: E0320 10:55:39.887214 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.924613 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.924664 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.924674 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.924690 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:39 crc kubenswrapper[4695]: I0320 10:55:39.924701 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:39Z","lastTransitionTime":"2026-03-20T10:55:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.028269 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.028810 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.028824 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.028854 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.028869 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:40Z","lastTransitionTime":"2026-03-20T10:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.132398 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.132764 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.132824 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.132859 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.132882 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:40Z","lastTransitionTime":"2026-03-20T10:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.235892 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.235994 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.236015 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.236047 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.236067 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:40Z","lastTransitionTime":"2026-03-20T10:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.338345 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.338402 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.338416 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.338433 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.338448 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:40Z","lastTransitionTime":"2026-03-20T10:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.441845 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.441894 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.441904 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.441936 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.441946 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:40Z","lastTransitionTime":"2026-03-20T10:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.544638 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.544687 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.544699 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.544715 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.544726 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:40Z","lastTransitionTime":"2026-03-20T10:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.647963 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.648284 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.648389 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.648497 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.648578 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:40Z","lastTransitionTime":"2026-03-20T10:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.751139 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.751179 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.751189 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.751204 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.751215 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:40Z","lastTransitionTime":"2026-03-20T10:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.853763 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.853803 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.853814 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.853830 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.853842 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:40Z","lastTransitionTime":"2026-03-20T10:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.886297 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:55:40 crc kubenswrapper[4695]: E0320 10:55:40.886453 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.897147 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.958860 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.958931 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.958943 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.958961 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:40 crc kubenswrapper[4695]: I0320 10:55:40.958971 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:40Z","lastTransitionTime":"2026-03-20T10:55:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.061310 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.061359 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.061372 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.061389 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.061401 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:41Z","lastTransitionTime":"2026-03-20T10:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.164799 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.164847 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.164859 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.164877 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.164887 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:41Z","lastTransitionTime":"2026-03-20T10:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.268025 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.268054 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.268063 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.268079 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.268088 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:41Z","lastTransitionTime":"2026-03-20T10:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.370816 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.370898 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.370932 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.370955 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.370969 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:41Z","lastTransitionTime":"2026-03-20T10:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.473669 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.473727 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.473739 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.473756 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.473769 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:41Z","lastTransitionTime":"2026-03-20T10:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.577227 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.577283 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.577299 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.577322 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.577334 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:41Z","lastTransitionTime":"2026-03-20T10:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.679898 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.679982 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.679994 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.680015 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.680030 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:41Z","lastTransitionTime":"2026-03-20T10:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.782324 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.782383 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.782395 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.782410 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.782419 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:41Z","lastTransitionTime":"2026-03-20T10:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.886002 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.886091 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.886102 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.886101 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.886122 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.886121 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.886142 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:41Z","lastTransitionTime":"2026-03-20T10:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.886388 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:41 crc kubenswrapper[4695]: E0320 10:55:41.886425 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:55:41 crc kubenswrapper[4695]: E0320 10:55:41.886666 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:55:41 crc kubenswrapper[4695]: E0320 10:55:41.887038 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.989399 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.989455 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.989473 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.989492 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:41 crc kubenswrapper[4695]: I0320 10:55:41.989508 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:41Z","lastTransitionTime":"2026-03-20T10:55:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.091889 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.091960 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.091973 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.091992 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.092002 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:42Z","lastTransitionTime":"2026-03-20T10:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.194415 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.194478 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.194491 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.194508 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.194521 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:42Z","lastTransitionTime":"2026-03-20T10:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.297479 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.297536 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.297548 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.297566 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.297580 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:42Z","lastTransitionTime":"2026-03-20T10:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.401843 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.401942 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.401967 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.401990 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.402001 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:42Z","lastTransitionTime":"2026-03-20T10:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.505148 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.505279 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.505319 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.505358 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.505371 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:42Z","lastTransitionTime":"2026-03-20T10:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.607893 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.607950 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.607959 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.607974 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.607983 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:42Z","lastTransitionTime":"2026-03-20T10:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.711236 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.711364 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.711389 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.711491 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.711520 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:42Z","lastTransitionTime":"2026-03-20T10:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.815244 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.815285 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.815294 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.815310 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.815321 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:42Z","lastTransitionTime":"2026-03-20T10:55:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.886681 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:55:42 crc kubenswrapper[4695]: E0320 10:55:42.886851 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.901316 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbd362cf-4525-4896-9335-1c8cda4303bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810780c72e3134aa544e31a393caed80dee3a1ba5db1752f5a3f77a775f84b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\
\\"containerID\\\":\\\"cri-o://048cd0ec21ce074a74507343df7e1a0d7a1421ca70c31d8c062d836fe6891575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8652889bb2db85da875dc4c200322271ea04d1ef7f217730f1d3e6798ff878e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9a539541a9bd8cbe81fbb9910dba50a7b1310c2deebc06bf4fec7d03069e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9a539541a9bd8cbe81fbb9910dba50a7b1310c2deebc06bf4fec7d03069e4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.915241 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:42 crc kubenswrapper[4695]: E0320 10:55:42.916210 4695 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.929697 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1
d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.941472 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.953700 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h5s76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0468323-460e-4bf3-be74-9c2330bde834\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h5s76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:42 crc 
kubenswrapper[4695]: I0320 10:55:42.967487 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.980198 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:42 crc kubenswrapper[4695]: I0320 10:55:42.990765 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:43 crc kubenswrapper[4695]: I0320 10:55:43.003540 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:55:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:43 crc kubenswrapper[4695]: I0320 10:55:43.015731 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:43 crc kubenswrapper[4695]: I0320 10:55:43.033671 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308a0ad5e55245d181191a0ed8b1dfd84d8aa18e
32d617bed8e3611679fc5b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:43 crc kubenswrapper[4695]: E0320 10:55:43.045067 4695 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 10:55:43 crc kubenswrapper[4695]: I0320 10:55:43.052287 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e6a699ee6242e5555c1c7029f5f66e1ff2d87d2115fdaf7edf95489007330f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f834ec041a5e5fd186567eed4651136f52f2a785f2eee742e12da0470a1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-85t4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:43 crc kubenswrapper[4695]: I0320 10:55:43.072888 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:43 crc kubenswrapper[4695]: I0320 10:55:43.086617 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\",\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:43 crc kubenswrapper[4695]: I0320 10:55:43.097711 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:43 crc kubenswrapper[4695]: I0320 10:55:43.111967 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:43 crc kubenswrapper[4695]: I0320 10:55:43.131442 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4992cdd2d101bccf764244535fda9b15ae017e0c1cb4b4fd221999a605d8cc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4992cdd2d101bccf764244535fda9b15ae017e0c1cb4b4fd221999a605d8cc7d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:35Z\\\",\\\"message\\\":\\\"0:55:35.765482 6815 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0320 10:55:35.765488 6815 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0320 10:55:35.765503 6815 default_network_controller.go:776] Recording success 
event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0320 10:55:35.765485 6815 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0320 10:55:35.765509 6815 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 10:55:35.765522 6815 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0320 10:55:35.765530 6815 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0320 10:55:35.765561 6815 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF0320 10:55:35.765601 6815 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nx4bc_openshift-ovn-kubernetes(7010d107-c3b1-4cc2-83c2-523df13ecd43)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc25931
3e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:43 crc kubenswrapper[4695]: I0320 10:55:43.143053 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qrsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53713843-d62b-411e-908c-18f9452f6bf2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecd827c2bd1743e23f1f976847aab27852abff1a9f4bd54950e6d359d2ee2533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qrsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:43 crc kubenswrapper[4695]: I0320 10:55:43.886877 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:43 crc kubenswrapper[4695]: I0320 10:55:43.886973 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:43 crc kubenswrapper[4695]: I0320 10:55:43.886893 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:43 crc kubenswrapper[4695]: E0320 10:55:43.887064 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:55:43 crc kubenswrapper[4695]: E0320 10:55:43.887229 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:55:43 crc kubenswrapper[4695]: E0320 10:55:43.887331 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:55:44 crc kubenswrapper[4695]: I0320 10:55:44.640798 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:44 crc kubenswrapper[4695]: I0320 10:55:44.640833 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:44 crc kubenswrapper[4695]: I0320 10:55:44.640844 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:44 crc kubenswrapper[4695]: I0320 10:55:44.640860 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:44 crc kubenswrapper[4695]: I0320 10:55:44.640873 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:44Z","lastTransitionTime":"2026-03-20T10:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:44 crc kubenswrapper[4695]: E0320 10:55:44.654726 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:44 crc kubenswrapper[4695]: I0320 10:55:44.660525 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:44 crc kubenswrapper[4695]: I0320 10:55:44.660571 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:44 crc kubenswrapper[4695]: I0320 10:55:44.660582 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:44 crc kubenswrapper[4695]: I0320 10:55:44.660597 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:44 crc kubenswrapper[4695]: I0320 10:55:44.660606 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:44Z","lastTransitionTime":"2026-03-20T10:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:44 crc kubenswrapper[4695]: E0320 10:55:44.675135 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:44 crc kubenswrapper[4695]: I0320 10:55:44.680401 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:44 crc kubenswrapper[4695]: I0320 10:55:44.680454 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:44 crc kubenswrapper[4695]: I0320 10:55:44.680465 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:44 crc kubenswrapper[4695]: I0320 10:55:44.680486 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:44 crc kubenswrapper[4695]: I0320 10:55:44.680499 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:44Z","lastTransitionTime":"2026-03-20T10:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:44 crc kubenswrapper[4695]: E0320 10:55:44.694222 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:44 crc kubenswrapper[4695]: I0320 10:55:44.699354 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:44 crc kubenswrapper[4695]: I0320 10:55:44.699733 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:44 crc kubenswrapper[4695]: I0320 10:55:44.699871 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:44 crc kubenswrapper[4695]: I0320 10:55:44.700000 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:44 crc kubenswrapper[4695]: I0320 10:55:44.700080 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:44Z","lastTransitionTime":"2026-03-20T10:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:44 crc kubenswrapper[4695]: E0320 10:55:44.713818 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:44 crc kubenswrapper[4695]: I0320 10:55:44.717959 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:44 crc kubenswrapper[4695]: I0320 10:55:44.718056 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:44 crc kubenswrapper[4695]: I0320 10:55:44.718069 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:44 crc kubenswrapper[4695]: I0320 10:55:44.718087 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:44 crc kubenswrapper[4695]: I0320 10:55:44.718121 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:44Z","lastTransitionTime":"2026-03-20T10:55:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:44 crc kubenswrapper[4695]: E0320 10:55:44.732773 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:44Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:44 crc kubenswrapper[4695]: E0320 10:55:44.733371 4695 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:55:44 crc kubenswrapper[4695]: I0320 10:55:44.886502 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:55:44 crc kubenswrapper[4695]: E0320 10:55:44.886704 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:55:45 crc kubenswrapper[4695]: I0320 10:55:45.886394 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:45 crc kubenswrapper[4695]: I0320 10:55:45.886442 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:45 crc kubenswrapper[4695]: E0320 10:55:45.886561 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:55:45 crc kubenswrapper[4695]: I0320 10:55:45.886576 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:45 crc kubenswrapper[4695]: E0320 10:55:45.886619 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:55:45 crc kubenswrapper[4695]: E0320 10:55:45.886686 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:55:46 crc kubenswrapper[4695]: I0320 10:55:46.886128 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:55:46 crc kubenswrapper[4695]: E0320 10:55:46.886333 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:55:47 crc kubenswrapper[4695]: I0320 10:55:47.886131 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:47 crc kubenswrapper[4695]: I0320 10:55:47.886131 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:47 crc kubenswrapper[4695]: I0320 10:55:47.886213 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:47 crc kubenswrapper[4695]: E0320 10:55:47.886625 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:55:47 crc kubenswrapper[4695]: E0320 10:55:47.886880 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:55:47 crc kubenswrapper[4695]: E0320 10:55:47.886791 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:55:48 crc kubenswrapper[4695]: E0320 10:55:48.046341 4695 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:55:48 crc kubenswrapper[4695]: I0320 10:55:48.887109 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:55:48 crc kubenswrapper[4695]: E0320 10:55:48.887312 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:55:49 crc kubenswrapper[4695]: I0320 10:55:49.887050 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:49 crc kubenswrapper[4695]: I0320 10:55:49.887050 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:49 crc kubenswrapper[4695]: I0320 10:55:49.887084 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:49 crc kubenswrapper[4695]: E0320 10:55:49.887271 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:55:49 crc kubenswrapper[4695]: E0320 10:55:49.887385 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:55:49 crc kubenswrapper[4695]: E0320 10:55:49.887458 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:55:50 crc kubenswrapper[4695]: I0320 10:55:50.886956 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:55:50 crc kubenswrapper[4695]: E0320 10:55:50.887109 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:55:51 crc kubenswrapper[4695]: I0320 10:55:51.886306 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:51 crc kubenswrapper[4695]: I0320 10:55:51.886364 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:51 crc kubenswrapper[4695]: I0320 10:55:51.886332 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:51 crc kubenswrapper[4695]: E0320 10:55:51.886894 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:55:51 crc kubenswrapper[4695]: E0320 10:55:51.886945 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:55:51 crc kubenswrapper[4695]: E0320 10:55:51.887020 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:55:51 crc kubenswrapper[4695]: I0320 10:55:51.887385 4695 scope.go:117] "RemoveContainer" containerID="4992cdd2d101bccf764244535fda9b15ae017e0c1cb4b4fd221999a605d8cc7d" Mar 20 10:55:51 crc kubenswrapper[4695]: E0320 10:55:51.887723 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-nx4bc_openshift-ovn-kubernetes(7010d107-c3b1-4cc2-83c2-523df13ecd43)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" Mar 20 10:55:52 crc kubenswrapper[4695]: I0320 10:55:52.886881 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:55:52 crc kubenswrapper[4695]: E0320 10:55:52.887117 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:55:52 crc kubenswrapper[4695]: I0320 10:55:52.902685 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:52 crc kubenswrapper[4695]: I0320 10:55:52.919706 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:52 crc kubenswrapper[4695]: I0320 10:55:52.932810 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:52 crc kubenswrapper[4695]: I0320 10:55:52.948025 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:55:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:52 crc kubenswrapper[4695]: I0320 10:55:52.960419 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:52 crc kubenswrapper[4695]: I0320 10:55:52.978088 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308a0ad5e55245d181191a0ed8b1dfd84d8aa18e
32d617bed8e3611679fc5b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:52 crc kubenswrapper[4695]: I0320 10:55:52.989694 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qrsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53713843-d62b-411e-908c-18f9452f6bf2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecd827c2bd1743e23f1f976847aab27852abff1a9f4bd54950e6d359d2ee2533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qrsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:53 crc kubenswrapper[4695]: I0320 10:55:53.001989 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e6a699ee6242e5555c1c7029f5f66e1ff2d87d2115fdaf7edf95489007330f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f834ec041a5e5fd186567eed4651136f52f2a785f2eee742e12da0470a1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-85t4r\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:53 crc kubenswrapper[4695]: I0320 10:55:53.024855 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7
c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:53 crc kubenswrapper[4695]: I0320 10:55:53.039434 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\"
,\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:53 crc kubenswrapper[4695]: E0320 10:55:53.047281 4695 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 10:55:53 crc kubenswrapper[4695]: I0320 10:55:53.058681 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:53 crc kubenswrapper[4695]: I0320 10:55:53.072497 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10
:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:53 crc kubenswrapper[4695]: I0320 10:55:53.093307 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4992cdd2d101bccf764244535fda9b15ae017e0c1cb4b4fd221999a605d8cc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4992cdd2d101bccf764244535fda9b15ae017e0c1cb4b4fd221999a605d8cc7d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:35Z\\\",\\\"message\\\":\\\"0:55:35.765482 6815 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0320 10:55:35.765488 6815 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0320 10:55:35.765503 6815 default_network_controller.go:776] Recording success 
event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0320 10:55:35.765485 6815 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0320 10:55:35.765509 6815 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 10:55:35.765522 6815 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0320 10:55:35.765530 6815 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0320 10:55:35.765561 6815 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF0320 10:55:35.765601 6815 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nx4bc_openshift-ovn-kubernetes(7010d107-c3b1-4cc2-83c2-523df13ecd43)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc25931
3e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:53 crc kubenswrapper[4695]: I0320 10:55:53.104879 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h5s76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0468323-460e-4bf3-be74-9c2330bde834\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h5s76\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:53 crc kubenswrapper[4695]: I0320 10:55:53.119657 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbd362cf-4525-4896-9335-1c8cda4303bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810780c72e3134aa544e31a393caed80dee3a1ba5db1752f5a3f77a775f84b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://048cd0ec21ce074a74507343df7e1a0d7a1421ca70c31d8c062d836fe6891575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8652889bb2db85da875dc4c200322271ea04d1ef7f217730f1d3e6798ff878e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9a5395
41a9bd8cbe81fbb9910dba50a7b1310c2deebc06bf4fec7d03069e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9a539541a9bd8cbe81fbb9910dba50a7b1310c2deebc06bf4fec7d03069e4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:53 crc kubenswrapper[4695]: I0320 10:55:53.137801 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:53 crc kubenswrapper[4695]: I0320 10:55:53.155600 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:53 crc kubenswrapper[4695]: I0320 10:55:53.170801 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:53Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:53 crc kubenswrapper[4695]: I0320 10:55:53.439193 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/e0468323-460e-4bf3-be74-9c2330bde834-metrics-certs\") pod \"network-metrics-daemon-h5s76\" (UID: \"e0468323-460e-4bf3-be74-9c2330bde834\") " pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:55:53 crc kubenswrapper[4695]: E0320 10:55:53.439379 4695 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:55:53 crc kubenswrapper[4695]: E0320 10:55:53.439509 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0468323-460e-4bf3-be74-9c2330bde834-metrics-certs podName:e0468323-460e-4bf3-be74-9c2330bde834 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:25.439480481 +0000 UTC m=+163.220086084 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0468323-460e-4bf3-be74-9c2330bde834-metrics-certs") pod "network-metrics-daemon-h5s76" (UID: "e0468323-460e-4bf3-be74-9c2330bde834") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:55:53 crc kubenswrapper[4695]: I0320 10:55:53.886858 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:53 crc kubenswrapper[4695]: I0320 10:55:53.886935 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:53 crc kubenswrapper[4695]: I0320 10:55:53.887075 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:53 crc kubenswrapper[4695]: E0320 10:55:53.887201 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:55:53 crc kubenswrapper[4695]: E0320 10:55:53.887371 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:55:53 crc kubenswrapper[4695]: E0320 10:55:53.887460 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:55:54 crc kubenswrapper[4695]: I0320 10:55:54.886690 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:55:54 crc kubenswrapper[4695]: E0320 10:55:54.886893 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:55:55 crc kubenswrapper[4695]: I0320 10:55:55.014348 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:55 crc kubenswrapper[4695]: I0320 10:55:55.014409 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:55 crc kubenswrapper[4695]: I0320 10:55:55.014427 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:55 crc kubenswrapper[4695]: I0320 10:55:55.014452 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:55 crc kubenswrapper[4695]: I0320 10:55:55.014468 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:55Z","lastTransitionTime":"2026-03-20T10:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:55 crc kubenswrapper[4695]: E0320 10:55:55.030743 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:55 crc kubenswrapper[4695]: I0320 10:55:55.035705 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:55 crc kubenswrapper[4695]: I0320 10:55:55.035748 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:55 crc kubenswrapper[4695]: I0320 10:55:55.035761 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:55 crc kubenswrapper[4695]: I0320 10:55:55.035779 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:55 crc kubenswrapper[4695]: I0320 10:55:55.035790 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:55Z","lastTransitionTime":"2026-03-20T10:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:55 crc kubenswrapper[4695]: E0320 10:55:55.049551 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:55 crc kubenswrapper[4695]: I0320 10:55:55.054973 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:55 crc kubenswrapper[4695]: I0320 10:55:55.055020 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:55 crc kubenswrapper[4695]: I0320 10:55:55.055030 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:55 crc kubenswrapper[4695]: I0320 10:55:55.055049 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:55 crc kubenswrapper[4695]: I0320 10:55:55.055060 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:55Z","lastTransitionTime":"2026-03-20T10:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:55 crc kubenswrapper[4695]: E0320 10:55:55.067557 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:55 crc kubenswrapper[4695]: I0320 10:55:55.072191 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:55 crc kubenswrapper[4695]: I0320 10:55:55.072234 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:55 crc kubenswrapper[4695]: I0320 10:55:55.072244 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:55 crc kubenswrapper[4695]: I0320 10:55:55.072262 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:55 crc kubenswrapper[4695]: I0320 10:55:55.072273 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:55Z","lastTransitionTime":"2026-03-20T10:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:55 crc kubenswrapper[4695]: E0320 10:55:55.086380 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:55 crc kubenswrapper[4695]: I0320 10:55:55.090158 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:55 crc kubenswrapper[4695]: I0320 10:55:55.090192 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:55 crc kubenswrapper[4695]: I0320 10:55:55.090202 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:55 crc kubenswrapper[4695]: I0320 10:55:55.090218 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:55:55 crc kubenswrapper[4695]: I0320 10:55:55.090228 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:55:55Z","lastTransitionTime":"2026-03-20T10:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:55:55 crc kubenswrapper[4695]: E0320 10:55:55.103402 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:55Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:55 crc kubenswrapper[4695]: E0320 10:55:55.103565 4695 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:55:55 crc kubenswrapper[4695]: I0320 10:55:55.886539 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:55 crc kubenswrapper[4695]: I0320 10:55:55.886614 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:55 crc kubenswrapper[4695]: I0320 10:55:55.886679 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:55 crc kubenswrapper[4695]: E0320 10:55:55.886798 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:55:55 crc kubenswrapper[4695]: E0320 10:55:55.886891 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:55:55 crc kubenswrapper[4695]: E0320 10:55:55.887203 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:55:56 crc kubenswrapper[4695]: I0320 10:55:56.886417 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:55:56 crc kubenswrapper[4695]: E0320 10:55:56.886623 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:55:57 crc kubenswrapper[4695]: I0320 10:55:57.886334 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:57 crc kubenswrapper[4695]: I0320 10:55:57.886418 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:57 crc kubenswrapper[4695]: I0320 10:55:57.886484 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:57 crc kubenswrapper[4695]: E0320 10:55:57.886542 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:55:57 crc kubenswrapper[4695]: E0320 10:55:57.886686 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:55:57 crc kubenswrapper[4695]: I0320 10:55:57.886723 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:55:57 crc kubenswrapper[4695]: E0320 10:55:57.886789 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:55:57 crc kubenswrapper[4695]: I0320 10:55:57.886872 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:57 crc kubenswrapper[4695]: E0320 10:55:57.886971 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:01.886934885 +0000 UTC m=+199.667540448 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:55:57 crc kubenswrapper[4695]: I0320 10:55:57.887011 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:57 crc kubenswrapper[4695]: I0320 10:55:57.887052 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" 
(UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:57 crc kubenswrapper[4695]: E0320 10:55:57.887064 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:55:57 crc kubenswrapper[4695]: I0320 10:55:57.887077 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:57 crc kubenswrapper[4695]: E0320 10:55:57.887082 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:55:57 crc kubenswrapper[4695]: E0320 10:55:57.887117 4695 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:55:57 crc kubenswrapper[4695]: E0320 10:55:57.887121 4695 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:55:57 crc kubenswrapper[4695]: E0320 10:55:57.887159 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf 
podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:57:01.88714507 +0000 UTC m=+199.667750633 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:55:57 crc kubenswrapper[4695]: E0320 10:55:57.887176 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:57:01.887167901 +0000 UTC m=+199.667773464 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:55:57 crc kubenswrapper[4695]: E0320 10:55:57.887236 4695 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:55:57 crc kubenswrapper[4695]: E0320 10:55:57.887282 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:55:57 crc kubenswrapper[4695]: E0320 10:55:57.887307 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered Mar 20 10:55:57 crc kubenswrapper[4695]: E0320 10:55:57.887323 4695 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:55:57 crc kubenswrapper[4695]: E0320 10:55:57.887334 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:57:01.887304514 +0000 UTC m=+199.667910217 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:55:57 crc kubenswrapper[4695]: E0320 10:55:57.887392 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:57:01.887370355 +0000 UTC m=+199.667976088 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:55:58 crc kubenswrapper[4695]: E0320 10:55:58.048428 4695 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:55:58 crc kubenswrapper[4695]: I0320 10:55:58.694525 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hg7g5_52301735-de4f-4672-9e4d-6bd74bccedad/kube-multus/0.log" Mar 20 10:55:58 crc kubenswrapper[4695]: I0320 10:55:58.694596 4695 generic.go:334] "Generic (PLEG): container finished" podID="52301735-de4f-4672-9e4d-6bd74bccedad" containerID="4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e" exitCode=1 Mar 20 10:55:58 crc kubenswrapper[4695]: I0320 10:55:58.694643 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hg7g5" event={"ID":"52301735-de4f-4672-9e4d-6bd74bccedad","Type":"ContainerDied","Data":"4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e"} Mar 20 10:55:58 crc kubenswrapper[4695]: I0320 10:55:58.695258 4695 scope.go:117] "RemoveContainer" containerID="4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e" Mar 20 10:55:58 crc kubenswrapper[4695]: I0320 10:55:58.707886 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qrsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53713843-d62b-411e-908c-18f9452f6bf2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecd827c2bd1743e23f1f976847aab27852abff1a9f4bd54950e6d359d2ee2533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qrsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:58Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:58 crc kubenswrapper[4695]: I0320 10:55:58.721059 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e6a699ee6242e5555c1c7029f5f66e1ff2d87d2115fdaf7edf95489007330f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f834ec041a5e5fd186567eed4651136f52f2a785f2eee742e12da0470a1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-85t4r\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:58Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:58 crc kubenswrapper[4695]: I0320 10:55:58.745802 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7
c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:58Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:58 crc kubenswrapper[4695]: I0320 10:55:58.762128 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\"
,\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:58Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:58 crc kubenswrapper[4695]: I0320 10:55:58.780041 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:58Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:58 crc kubenswrapper[4695]: I0320 10:55:58.795856 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:58Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:58Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:57Z\\\",\\\"message\\\":\\\"2026-03-20T10:55:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3b7c2b7-f029-4942-80e6-a588e7ac9d03\\\\n2026-03-20T10:55:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3b7c2b7-f029-4942-80e6-a588e7ac9d03 to /host/opt/cni/bin/\\\\n2026-03-20T10:55:12Z [verbose] multus-daemon started\\\\n2026-03-20T10:55:12Z [verbose] Readiness Indicator file check\\\\n2026-03-20T10:55:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:58Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:58 crc kubenswrapper[4695]: I0320 10:55:58.815969 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4992cdd2d101bccf764244535fda9b15ae017e0c1cb4b4fd221999a605d8cc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4992cdd2d101bccf764244535fda9b15ae017e0c1cb4b4fd221999a605d8cc7d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:35Z\\\",\\\"message\\\":\\\"0:55:35.765482 6815 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0320 10:55:35.765488 6815 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0320 10:55:35.765503 6815 default_network_controller.go:776] Recording success 
event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0320 10:55:35.765485 6815 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0320 10:55:35.765509 6815 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 10:55:35.765522 6815 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0320 10:55:35.765530 6815 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0320 10:55:35.765561 6815 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF0320 10:55:35.765601 6815 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nx4bc_openshift-ovn-kubernetes(7010d107-c3b1-4cc2-83c2-523df13ecd43)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc25931
3e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:58Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:58 crc kubenswrapper[4695]: I0320 10:55:58.830451 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h5s76" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0468323-460e-4bf3-be74-9c2330bde834\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h5s76\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:58Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:58 crc kubenswrapper[4695]: I0320 10:55:58.846549 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbd362cf-4525-4896-9335-1c8cda4303bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810780c72e3134aa544e31a393caed80dee3a1ba5db1752f5a3f77a775f84b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://048cd0ec21ce074a74507343df7e1a0d7a1421ca70c31d8c062d836fe6891575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8652889bb2db85da875dc4c200322271ea04d1ef7f217730f1d3e6798ff878e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9a5395
41a9bd8cbe81fbb9910dba50a7b1310c2deebc06bf4fec7d03069e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9a539541a9bd8cbe81fbb9910dba50a7b1310c2deebc06bf4fec7d03069e4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:58Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:58 crc kubenswrapper[4695]: I0320 10:55:58.862187 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:58Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:58 crc kubenswrapper[4695]: I0320 10:55:58.879715 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:58Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:58 crc kubenswrapper[4695]: I0320 10:55:58.886745 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:55:58 crc kubenswrapper[4695]: E0320 10:55:58.886946 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:55:58 crc kubenswrapper[4695]: I0320 10:55:58.896568 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:58Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:58 crc kubenswrapper[4695]: I0320 10:55:58.914948 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:58Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:58 crc kubenswrapper[4695]: I0320 10:55:58.931362 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:58Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:58 crc kubenswrapper[4695]: I0320 10:55:58.945408 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:58Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:58 crc kubenswrapper[4695]: I0320 10:55:58.962834 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:55:58Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:58 crc kubenswrapper[4695]: I0320 10:55:58.975376 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:58Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:58 crc kubenswrapper[4695]: I0320 10:55:58.994191 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308a0ad5e55245d181191a0ed8b1dfd84d8aa18e
32d617bed8e3611679fc5b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:58Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:59 crc kubenswrapper[4695]: I0320 10:55:59.700990 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hg7g5_52301735-de4f-4672-9e4d-6bd74bccedad/kube-multus/0.log" Mar 20 10:55:59 crc kubenswrapper[4695]: I0320 10:55:59.701068 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hg7g5" event={"ID":"52301735-de4f-4672-9e4d-6bd74bccedad","Type":"ContainerStarted","Data":"05945c73a4c93e5fd6ce1bc322d9acf71d5c1005cd47a22875bfe3b8a3eb8806"} Mar 20 10:55:59 crc kubenswrapper[4695]: I0320 10:55:59.717725 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:59Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:59 crc kubenswrapper[4695]: I0320 10:55:59.739072 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:55:59Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:59 crc kubenswrapper[4695]: I0320 10:55:59.753229 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:59Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:59 crc kubenswrapper[4695]: I0320 10:55:59.770055 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308a0ad5e55245d181191a0ed8b1dfd84d8aa18e
32d617bed8e3611679fc5b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:59Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:59 crc kubenswrapper[4695]: I0320 10:55:59.790829 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:59Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:59 crc kubenswrapper[4695]: I0320 10:55:59.807755 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\",\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:59Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:59 crc kubenswrapper[4695]: I0320 10:55:59.823677 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:59Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:59 crc kubenswrapper[4695]: I0320 10:55:59.838816 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05945c73a4c93e5fd6ce1bc322d9acf71d5c1005cd47a22875bfe3b8a3eb8806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:57Z\\\",\\\"message\\\":\\\"2026-03-20T10:55:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3b7c2b7-f029-4942-80e6-a588e7ac9d03\\\\n2026-03-20T10:55:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3b7c2b7-f029-4942-80e6-a588e7ac9d03 to /host/opt/cni/bin/\\\\n2026-03-20T10:55:12Z [verbose] multus-daemon started\\\\n2026-03-20T10:55:12Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T10:55:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:59Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:59 crc kubenswrapper[4695]: I0320 10:55:59.859469 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4992cdd2d101bccf764244535fda9b15ae017e0c1cb4b4fd221999a605d8cc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4992cdd2d101bccf764244535fda9b15ae017e0c1cb4b4fd221999a605d8cc7d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:35Z\\\",\\\"message\\\":\\\"0:55:35.765482 6815 ovn.go:134] Ensuring zone local for Pod 
openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0320 10:55:35.765488 6815 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0320 10:55:35.765503 6815 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0320 10:55:35.765485 6815 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0320 10:55:35.765509 6815 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 10:55:35.765522 6815 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0320 10:55:35.765530 6815 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0320 10:55:35.765561 6815 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF0320 10:55:35.765601 6815 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nx4bc_openshift-ovn-kubernetes(7010d107-c3b1-4cc2-83c2-523df13ecd43)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc25931
3e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:59Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:59 crc kubenswrapper[4695]: I0320 10:55:59.874562 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qrsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53713843-d62b-411e-908c-18f9452f6bf2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecd827c2bd1743e23f1f976847aab27852abff1a9f4bd54950e6d359d2ee2533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qrsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:59Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:59 crc kubenswrapper[4695]: I0320 10:55:59.886760 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:55:59 crc kubenswrapper[4695]: I0320 10:55:59.886808 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:55:59 crc kubenswrapper[4695]: I0320 10:55:59.886854 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:55:59 crc kubenswrapper[4695]: E0320 10:55:59.886981 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:55:59 crc kubenswrapper[4695]: E0320 10:55:59.887187 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:55:59 crc kubenswrapper[4695]: E0320 10:55:59.887291 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:55:59 crc kubenswrapper[4695]: I0320 10:55:59.889497 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e6a699ee6242e5555c1c7029f5f66e1ff2d87d2115fdaf7edf95489007330f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438
c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f834ec041a5e5fd186567eed4651136f52f2a785f2eee742e12da0470a1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:20Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-85t4r\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:59Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:59 crc kubenswrapper[4695]: I0320 10:55:59.904453 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbd362cf-4525-4896-9335-1c8cda4303bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810780c72e3134aa544e31a393caed80dee3a1ba5db1752f5a3f77a775f84b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://048cd0ec21ce074a74507343df7e1a0d7a1421ca70c31d8c062d836fe6891575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8652889bb2db85da875dc4c200322271ea04d1ef7f217730f1d3e6798ff878e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\
\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9a539541a9bd8cbe81fbb9910dba50a7b1310c2deebc06bf4fec7d03069e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9a539541a9bd8cbe81fbb9910dba50a7b1310c2deebc06bf4fec7d03069e4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:59Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:59 crc kubenswrapper[4695]: I0320 10:55:59.919383 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:59Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:59 crc kubenswrapper[4695]: I0320 10:55:59.935760 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:59Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:59 crc kubenswrapper[4695]: I0320 10:55:59.951337 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:59Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:59 crc kubenswrapper[4695]: I0320 10:55:59.963870 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h5s76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0468323-460e-4bf3-be74-9c2330bde834\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h5s76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:59Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:59 crc 
kubenswrapper[4695]: I0320 10:55:59.978872 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:59Z is after 2025-08-24T17:21:41Z" Mar 20 10:55:59 crc kubenswrapper[4695]: I0320 10:55:59.993945 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:59Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:00 crc kubenswrapper[4695]: I0320 10:56:00.886599 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:56:00 crc kubenswrapper[4695]: E0320 10:56:00.886775 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:56:01 crc kubenswrapper[4695]: I0320 10:56:01.886487 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:01 crc kubenswrapper[4695]: E0320 10:56:01.886697 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:01 crc kubenswrapper[4695]: I0320 10:56:01.886856 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:01 crc kubenswrapper[4695]: I0320 10:56:01.886857 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:01 crc kubenswrapper[4695]: E0320 10:56:01.887072 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:01 crc kubenswrapper[4695]: E0320 10:56:01.887239 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:02 crc kubenswrapper[4695]: I0320 10:56:02.886893 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:56:02 crc kubenswrapper[4695]: E0320 10:56:02.887696 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:56:02 crc kubenswrapper[4695]: I0320 10:56:02.892432 4695 scope.go:117] "RemoveContainer" containerID="4992cdd2d101bccf764244535fda9b15ae017e0c1cb4b4fd221999a605d8cc7d" Mar 20 10:56:02 crc kubenswrapper[4695]: I0320 10:56:02.914716 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4992cdd2d101bccf764244535fda9b15ae017e0c1cb4b4fd221999a605d8cc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4992cdd2d101bccf764244535fda9b15ae017e0c1cb4b4fd221999a605d8cc7d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:35Z\\\",\\\"message\\\":\\\"0:55:35.765482 6815 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0320 10:55:35.765488 6815 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0320 10:55:35.765503 6815 default_network_controller.go:776] Recording success 
event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0320 10:55:35.765485 6815 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0320 10:55:35.765509 6815 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 10:55:35.765522 6815 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0320 10:55:35.765530 6815 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0320 10:55:35.765561 6815 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF0320 10:55:35.765601 6815 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nx4bc_openshift-ovn-kubernetes(7010d107-c3b1-4cc2-83c2-523df13ecd43)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc25931
3e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:02Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:02 crc kubenswrapper[4695]: I0320 10:56:02.926866 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qrsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53713843-d62b-411e-908c-18f9452f6bf2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecd827c2bd1743e23f1f976847aab27852abff1a9f4bd54950e6d359d2ee2533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qrsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:02Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:02 crc kubenswrapper[4695]: I0320 10:56:02.938967 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e6a699ee6242e5555c1c7029f5f66e1ff2d87d2115fdaf7edf95489007330f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f834ec041a5e5fd186567eed4651136f52f2a785f2eee742e12da0470a1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-85t4r\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:02Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:02 crc kubenswrapper[4695]: I0320 10:56:02.961132 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7
c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:02Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:02 crc kubenswrapper[4695]: I0320 10:56:02.976188 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\"
,\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:02Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:02 crc kubenswrapper[4695]: I0320 10:56:02.991749 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:02Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:03 crc kubenswrapper[4695]: I0320 10:56:03.007283 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05945c73a4c93e5fd6ce1bc322d9acf71d5c1005cd47a22875bfe3b8a3eb8806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:57Z\\\",\\\"message\\\":\\\"2026-03-20T10:55:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3b7c2b7-f029-4942-80e6-a588e7ac9d03\\\\n2026-03-20T10:55:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3b7c2b7-f029-4942-80e6-a588e7ac9d03 to /host/opt/cni/bin/\\\\n2026-03-20T10:55:12Z [verbose] multus-daemon started\\\\n2026-03-20T10:55:12Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T10:55:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:03Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:03 crc kubenswrapper[4695]: I0320 10:56:03.023207 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:03Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:03 crc kubenswrapper[4695]: I0320 10:56:03.041022 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h5s76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0468323-460e-4bf3-be74-9c2330bde834\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h5s76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:03Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:03 crc 
kubenswrapper[4695]: E0320 10:56:03.048980 4695 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:56:03 crc kubenswrapper[4695]: I0320 10:56:03.060671 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbd362cf-4525-4896-9335-1c8cda4303bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810780c72e3134aa544e31a393caed80dee3a1ba5db1752f5a3f77a775f84b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://048cd0ec21ce074a74507343df7e1a0d7a1421ca70c31d8c062d836fe6891575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8652889bb2db85da875dc4c200322271ea04d1ef7f217730f1d3e6798ff878e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9a539541a9b
d8cbe81fbb9910dba50a7b1310c2deebc06bf4fec7d03069e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9a539541a9bd8cbe81fbb9910dba50a7b1310c2deebc06bf4fec7d03069e4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:03Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:03 crc kubenswrapper[4695]: I0320 10:56:03.076640 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:03Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:03 crc kubenswrapper[4695]: I0320 10:56:03.092470 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:03Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:03 crc kubenswrapper[4695]: I0320 10:56:03.108117 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:03Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:03 crc kubenswrapper[4695]: I0320 10:56:03.122991 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:03Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:03 crc kubenswrapper[4695]: I0320 10:56:03.134660 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:03Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:03 crc kubenswrapper[4695]: I0320 10:56:03.146836 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:56:03Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:03 crc kubenswrapper[4695]: I0320 10:56:03.160034 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:03Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:03 crc kubenswrapper[4695]: I0320 10:56:03.177595 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308a0ad5e55245d181191a0ed8b1dfd84d8aa18e
32d617bed8e3611679fc5b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:03Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:03 crc kubenswrapper[4695]: I0320 10:56:03.717881 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx4bc_7010d107-c3b1-4cc2-83c2-523df13ecd43/ovnkube-controller/2.log" Mar 20 10:56:03 crc kubenswrapper[4695]: I0320 10:56:03.722098 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" event={"ID":"7010d107-c3b1-4cc2-83c2-523df13ecd43","Type":"ContainerStarted","Data":"dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044"} Mar 20 10:56:03 crc kubenswrapper[4695]: I0320 10:56:03.722758 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:56:03 crc 
kubenswrapper[4695]: I0320 10:56:03.738198 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:03Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:03 crc kubenswrapper[4695]: I0320 10:56:03.753773 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:03Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:03 crc kubenswrapper[4695]: I0320 10:56:03.768544 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h5s76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0468323-460e-4bf3-be74-9c2330bde834\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h5s76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:03Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:03 crc 
kubenswrapper[4695]: I0320 10:56:03.785058 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbd362cf-4525-4896-9335-1c8cda4303bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810780c72e3134aa544e31a393caed80dee3a1ba5db1752f5a3f77a775f84b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://048cd0ec21ce074a74507343df7e1a0d7a1421ca70c31d8c062d836fe6891575\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8652889bb2db85da875dc4c200322271ea04d1ef7f217730f1d3e6798ff878e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9a539541a9bd8cbe81fbb9910dba50a7b1310c2deebc06bf4fec7d03069e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9a539541a9bd8cbe81fbb9910dba50a7b1310c2deebc06bf4fec7d03069e4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:03Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:03 crc kubenswrapper[4695]: I0320 10:56:03.802869 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:03Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:03 crc kubenswrapper[4695]: I0320 10:56:03.823982 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:03Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:03 crc kubenswrapper[4695]: I0320 10:56:03.838781 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:03Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:03 crc kubenswrapper[4695]: I0320 10:56:03.856636 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308a0ad5e55245d181191a0ed8b1dfd84d8aa18e32d617bed8e3611679fc5b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c6b
10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:03Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:03 crc kubenswrapper[4695]: I0320 10:56:03.871233 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:03Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:03 crc kubenswrapper[4695]: I0320 10:56:03.887077 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:03 crc kubenswrapper[4695]: I0320 10:56:03.887125 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:03 crc kubenswrapper[4695]: E0320 10:56:03.887286 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:03 crc kubenswrapper[4695]: I0320 10:56:03.887507 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:03 crc kubenswrapper[4695]: E0320 10:56:03.887895 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:03 crc kubenswrapper[4695]: E0320 10:56:03.888202 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:03 crc kubenswrapper[4695]: I0320 10:56:03.892466 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:03Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:03 crc kubenswrapper[4695]: I0320 10:56:03.927422 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 20 10:56:03 crc kubenswrapper[4695]: I0320 10:56:03.943112 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageI
D\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:03Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:03 crc kubenswrapper[4695]: I0320 10:56:03.973032 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05945c73a4c93e5fd6ce1bc322d9acf71d5c1005cd47a22875bfe3b8a3eb8806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:57Z\\\",\\\"message\\\":\\\"2026-03-20T10:55:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3b7c2b7-f029-4942-80e6-a588e7ac9d03\\\\n2026-03-20T10:55:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3b7c2b7-f029-4942-80e6-a588e7ac9d03 to /host/opt/cni/bin/\\\\n2026-03-20T10:55:12Z [verbose] multus-daemon started\\\\n2026-03-20T10:55:12Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T10:55:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:03Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:03 crc kubenswrapper[4695]: I0320 10:56:03.995085 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4992cdd2d101bccf764244535fda9b15ae017e0c1cb4b4fd221999a605d8cc7d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:35Z\\\",\\\"message\\\":\\\"0:55:35.765482 6815 ovn.go:134] Ensuring zone local for Pod 
openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0320 10:55:35.765488 6815 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0320 10:55:35.765503 6815 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0320 10:55:35.765485 6815 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0320 10:55:35.765509 6815 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 10:55:35.765522 6815 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0320 10:55:35.765530 6815 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0320 10:55:35.765561 6815 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF0320 10:55:35.765601 6815 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:03Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:04 crc kubenswrapper[4695]: I0320 10:56:04.009300 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qrsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53713843-d62b-411e-908c-18f9452f6bf2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecd827c2bd1743e23f1f976847aab27852abff1a9f4bd54950e6d359d2ee2533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qrsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:04 crc kubenswrapper[4695]: I0320 10:56:04.024937 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e6a699ee6242e5555c1c7029f5f66e1ff2d87d2115fdaf7edf95489007330f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f834ec041a5e5fd186567eed4651136f52f2a785f2eee742e12da0470a1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-85t4r\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:04 crc kubenswrapper[4695]: I0320 10:56:04.048041 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7
c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:04 crc kubenswrapper[4695]: I0320 10:56:04.064192 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\"
,\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:04 crc kubenswrapper[4695]: I0320 10:56:04.080309 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:04 crc kubenswrapper[4695]: I0320 10:56:04.728220 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx4bc_7010d107-c3b1-4cc2-83c2-523df13ecd43/ovnkube-controller/3.log" Mar 20 10:56:04 crc kubenswrapper[4695]: I0320 10:56:04.728894 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx4bc_7010d107-c3b1-4cc2-83c2-523df13ecd43/ovnkube-controller/2.log" Mar 20 10:56:04 crc kubenswrapper[4695]: I0320 10:56:04.733514 4695 generic.go:334] "Generic (PLEG): container finished" podID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerID="dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044" exitCode=1 Mar 20 10:56:04 crc kubenswrapper[4695]: I0320 10:56:04.733623 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" 
event={"ID":"7010d107-c3b1-4cc2-83c2-523df13ecd43","Type":"ContainerDied","Data":"dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044"} Mar 20 10:56:04 crc kubenswrapper[4695]: I0320 10:56:04.733698 4695 scope.go:117] "RemoveContainer" containerID="4992cdd2d101bccf764244535fda9b15ae017e0c1cb4b4fd221999a605d8cc7d" Mar 20 10:56:04 crc kubenswrapper[4695]: I0320 10:56:04.734483 4695 scope.go:117] "RemoveContainer" containerID="dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044" Mar 20 10:56:04 crc kubenswrapper[4695]: E0320 10:56:04.734713 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nx4bc_openshift-ovn-kubernetes(7010d107-c3b1-4cc2-83c2-523df13ecd43)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" Mar 20 10:56:04 crc kubenswrapper[4695]: I0320 10:56:04.751744 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:04 crc kubenswrapper[4695]: I0320 10:56:04.767326 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:04 crc kubenswrapper[4695]: I0320 10:56:04.779364 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:04 crc kubenswrapper[4695]: I0320 10:56:04.794704 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:56:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:04 crc kubenswrapper[4695]: I0320 10:56:04.807001 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:04 crc kubenswrapper[4695]: I0320 10:56:04.825304 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308a0ad5e55245d181191a0ed8b1dfd84d8aa18e
32d617bed8e3611679fc5b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:04 crc kubenswrapper[4695]: I0320 10:56:04.851610 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4992cdd2d101bccf764244535fda9b15ae017e0c1cb4b4fd221999a605d8cc7d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:35Z\\\",\\\"message\\\":\\\"0:55:35.765482 6815 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0320 10:55:35.765488 6815 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0320 10:55:35.765503 6815 default_network_controller.go:776] Recording success 
event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0320 10:55:35.765485 6815 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0320 10:55:35.765509 6815 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0320 10:55:35.765522 6815 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0320 10:55:35.765530 6815 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf in node crc\\\\nI0320 10:55:35.765561 6815 base_network_controller_pods.go:477] [default/openshift-network-diagnostics/network-check-source-55646444c4-trplf] creating logical port openshift-network-diagnostics_network-check-source-55646444c4-trplf for pod on switch crc\\\\nF0320 10:55:35.765601 6815 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:03Z\\\",\\\"message\\\":\\\"ork=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver/apiserver_TCP_cluster\\\\\\\", UUID:\\\\\\\"d71b38eb-32af-4c0f-9490-7c317c111e3a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver/apiserver_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.93\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0320 10:56:03.970299 7146 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e58953
45f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:04 crc kubenswrapper[4695]: I0320 10:56:04.864625 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qrsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53713843-d62b-411e-908c-18f9452f6bf2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecd827c2bd1743e23f1f976847aab27852abff1a9f4bd54950e6d359d2ee2533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qrsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:04 crc kubenswrapper[4695]: I0320 10:56:04.877686 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e6a699ee6242e5555c1c7029f5f66e1ff2d87d2115fdaf7edf95489007330f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f834ec041a5e5fd186567eed4651136f52f2a785f2eee742e12da0470a1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-85t4r\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:04 crc kubenswrapper[4695]: I0320 10:56:04.886997 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:56:04 crc kubenswrapper[4695]: E0320 10:56:04.887189 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:56:04 crc kubenswrapper[4695]: I0320 10:56:04.902418 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:04 crc kubenswrapper[4695]: I0320 10:56:04.918770 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\",\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:04 crc kubenswrapper[4695]: I0320 10:56:04.935451 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:04 crc kubenswrapper[4695]: I0320 10:56:04.950310 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05945c73a4c93e5fd6ce1bc322d9acf71d5c1005cd47a22875bfe3b8a3eb8806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:57Z\\\",\\\"message\\\":\\\"2026-03-20T10:55:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3b7c2b7-f029-4942-80e6-a588e7ac9d03\\\\n2026-03-20T10:55:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3b7c2b7-f029-4942-80e6-a588e7ac9d03 to /host/opt/cni/bin/\\\\n2026-03-20T10:55:12Z [verbose] multus-daemon started\\\\n2026-03-20T10:55:12Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T10:55:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:04 crc kubenswrapper[4695]: I0320 10:56:04.965425 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:04 crc kubenswrapper[4695]: I0320 10:56:04.978591 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h5s76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0468323-460e-4bf3-be74-9c2330bde834\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h5s76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:04 crc 
kubenswrapper[4695]: I0320 10:56:04.994260 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1939bad4-97af-4184-bbe4-36f87795a4a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014756681bda842f91150730dfafc4c4cbcd129a645af7882f07a36b9a11243f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9848381ec62a8e9785a42314a255bc22c04cc2f52153c5ea0b1f797b56282bd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 10:53:45.815668 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 10:53:45.818152 1 observer_polling.go:159] Starting file observer\\\\nI0320 10:53:45.895681 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 10:53:45.901047 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 10:54:15.135198 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 10:54:15.135311 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:14Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33cdb444e3a559600355349111a57f3f6a8fdb565abafb0d1feb304f06a6f519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b382ffbd0c95442e6641029e57a71d00d56cdb073b590b5c154a807591cdc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77dab970ee3a33336d4a6f9830ed7c5fd19537a6dae4bcae01448402fd595ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.007415 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbd362cf-4525-4896-9335-1c8cda4303bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810780c72e3134aa544e31a393caed80dee3a1ba5db1752f5a3f77a775f84b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://048cd0ec21ce074a74507343df7e1a0d7a1421ca70c31d8c062d836fe6891575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8652889bb2db85da875dc4c200322271ea04d1ef7f217730f1d3e6798ff878e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9a539541a9bd8cbe81fbb9910dba50a7b1310c2deebc06bf4fec7d03069e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ac9a539541a9bd8cbe81fbb9910dba50a7b1310c2deebc06bf4fec7d03069e4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.020830 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.037048 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.236853 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.236956 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.236971 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.236995 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.237010 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:05Z","lastTransitionTime":"2026-03-20T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:05 crc kubenswrapper[4695]: E0320 10:56:05.253792 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.259094 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.259147 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.259158 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.259180 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.259195 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:05Z","lastTransitionTime":"2026-03-20T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:05 crc kubenswrapper[4695]: E0320 10:56:05.274982 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.282937 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.283005 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.283017 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.283037 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.283051 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:05Z","lastTransitionTime":"2026-03-20T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:05 crc kubenswrapper[4695]: E0320 10:56:05.297772 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.303201 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.303269 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.303285 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.303310 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.303327 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:05Z","lastTransitionTime":"2026-03-20T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:05 crc kubenswrapper[4695]: E0320 10:56:05.319066 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.324590 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.324647 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.324656 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.324673 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.324687 4695 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:05Z","lastTransitionTime":"2026-03-20T10:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:05 crc kubenswrapper[4695]: E0320 10:56:05.339087 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:05Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4695]: E0320 10:56:05.339273 4695 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.739218 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx4bc_7010d107-c3b1-4cc2-83c2-523df13ecd43/ovnkube-controller/3.log" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.742929 4695 scope.go:117] "RemoveContainer" containerID="dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044" Mar 20 10:56:05 crc kubenswrapper[4695]: E0320 10:56:05.743171 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nx4bc_openshift-ovn-kubernetes(7010d107-c3b1-4cc2-83c2-523df13ecd43)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.759093 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbd362cf-4525-4896-9335-1c8cda4303bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810780c72e3134aa544e31a393caed80dee3a1ba5db1752f5a3f77a775f84b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://048cd0ec21ce074a74507343df7e1a0d7a1421ca70c31d8c062d836fe6891575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8652889bb2db85da875dc4c200322271ea04d1ef7f217730f1d3e6798ff878e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9a539541a9bd8cbe81fbb9910dba50a7b1310c2deebc06bf4fec7d03069e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ac9a539541a9bd8cbe81fbb9910dba50a7b1310c2deebc06bf4fec7d03069e4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.773922 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.786322 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.797690 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.807261 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h5s76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0468323-460e-4bf3-be74-9c2330bde834\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h5s76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc 
kubenswrapper[4695]: I0320 10:56:05.822562 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1939bad4-97af-4184-bbe4-36f87795a4a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014756681bda842f91150730dfafc4c4cbcd129a645af7882f07a36b9a11243f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9848381ec62a8e9785a42314a255bc22c04cc2f52153c5ea0b1f797b56282bd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 10:53:45.815668 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 10:53:45.818152 1 observer_polling.go:159] Starting file observer\\\\nI0320 10:53:45.895681 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 10:53:45.901047 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 10:54:15.135198 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 10:54:15.135311 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:14Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33cdb444e3a559600355349111a57f3f6a8fdb565abafb0d1feb304f06a6f519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b382ffbd0c95442e6641029e57a71d00d56cdb073b590b5c154a807591cdc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77dab970ee3a33336d4a6f9830ed7c5fd19537a6dae4bcae01448402fd595ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.835645 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.853511 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.868816 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.883312 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.886473 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.886561 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:05 crc kubenswrapper[4695]: E0320 10:56:05.886604 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:05 crc kubenswrapper[4695]: E0320 10:56:05.886719 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.886776 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:05 crc kubenswrapper[4695]: E0320 10:56:05.886834 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.902686 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308a0ad5e55245d181191a0ed8b1dfd84d8aa18e32d617bed8e3611679fc5b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c6b
10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.916415 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.932601 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\"
,\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.947514 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.961522 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05945c73a4c93e5fd6ce1bc322d9acf71d5c1005cd47a22875bfe3b8a3eb8806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:57Z\\\",\\\"message\\\":\\\"2026-03-20T10:55:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3b7c2b7-f029-4942-80e6-a588e7ac9d03\\\\n2026-03-20T10:55:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3b7c2b7-f029-4942-80e6-a588e7ac9d03 to /host/opt/cni/bin/\\\\n2026-03-20T10:55:12Z [verbose] multus-daemon started\\\\n2026-03-20T10:55:12Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T10:55:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.984615 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:03Z\\\",\\\"message\\\":\\\"ork=default, existing lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver/apiserver_TCP_cluster\\\\\\\", UUID:\\\\\\\"d71b38eb-32af-4c0f-9490-7c317c111e3a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver/apiserver_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.93\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0320 10:56:03.970299 7146 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nx4bc_openshift-ovn-kubernetes(7010d107-c3b1-4cc2-83c2-523df13ecd43)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc25931
3e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:05 crc kubenswrapper[4695]: I0320 10:56:05.998549 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qrsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53713843-d62b-411e-908c-18f9452f6bf2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecd827c2bd1743e23f1f976847aab27852abff1a9f4bd54950e6d359d2ee2533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qrsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:05Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:06 crc kubenswrapper[4695]: I0320 10:56:06.011600 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e6a699ee6242e5555c1c7029f5f66e1ff2d87d2115fdaf7edf95489007330f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f834ec041a5e5fd186567eed4651136f52f2a785f2eee742e12da0470a1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-85t4r\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:06Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:06 crc kubenswrapper[4695]: I0320 10:56:06.035816 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7
c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:06Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:06 crc kubenswrapper[4695]: I0320 10:56:06.886810 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:56:06 crc kubenswrapper[4695]: E0320 10:56:06.887075 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:56:07 crc kubenswrapper[4695]: I0320 10:56:07.886947 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:56:07 crc kubenswrapper[4695]: I0320 10:56:07.887033 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:56:07 crc kubenswrapper[4695]: E0320 10:56:07.887126 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 10:56:07 crc kubenswrapper[4695]: I0320 10:56:07.886958 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:56:07 crc kubenswrapper[4695]: E0320 10:56:07.887299 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 10:56:07 crc kubenswrapper[4695]: E0320 10:56:07.887401 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:56:08 crc kubenswrapper[4695]: E0320 10:56:08.050787 4695 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 10:56:08 crc kubenswrapper[4695]: I0320 10:56:08.886664 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76"
Mar 20 10:56:08 crc kubenswrapper[4695]: E0320 10:56:08.887348 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834"
Mar 20 10:56:09 crc kubenswrapper[4695]: I0320 10:56:09.887069 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:56:09 crc kubenswrapper[4695]: I0320 10:56:09.887069 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:56:09 crc kubenswrapper[4695]: I0320 10:56:09.887067 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:56:09 crc kubenswrapper[4695]: E0320 10:56:09.887338 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 10:56:09 crc kubenswrapper[4695]: E0320 10:56:09.887578 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:56:09 crc kubenswrapper[4695]: E0320 10:56:09.887669 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 10:56:10 crc kubenswrapper[4695]: I0320 10:56:10.886835 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76"
Mar 20 10:56:10 crc kubenswrapper[4695]: E0320 10:56:10.887037 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834"
Mar 20 10:56:11 crc kubenswrapper[4695]: I0320 10:56:11.887206 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:56:11 crc kubenswrapper[4695]: I0320 10:56:11.887206 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:56:11 crc kubenswrapper[4695]: E0320 10:56:11.887469 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 10:56:11 crc kubenswrapper[4695]: I0320 10:56:11.887585 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:56:11 crc kubenswrapper[4695]: E0320 10:56:11.887731 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 10:56:11 crc kubenswrapper[4695]: E0320 10:56:11.887841 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:56:12 crc kubenswrapper[4695]: I0320 10:56:12.887048 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76"
Mar 20 10:56:12 crc kubenswrapper[4695]: E0320 10:56:12.887214 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:56:12 crc kubenswrapper[4695]: I0320 10:56:12.903248 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:12 crc kubenswrapper[4695]: I0320 10:56:12.920343 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308a0ad5e55245d181191a0ed8b1dfd84d8a
a18e32d617bed8e3611679fc5b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f24
9e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:12 crc kubenswrapper[4695]: I0320 10:56:12.935392 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:12 crc kubenswrapper[4695]: I0320 10:56:12.948060 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:56:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:12 crc kubenswrapper[4695]: I0320 10:56:12.961879 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:12 crc kubenswrapper[4695]: I0320 10:56:12.978461 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05945c73a4c93e5fd6ce1bc322d9acf71d5c1005cd47a22875bfe3b8a3eb8806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\
\":{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:57Z\\\",\\\"message\\\":\\\"2026-03-20T10:55:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3b7c2b7-f029-4942-80e6-a588e7ac9d03\\\\n2026-03-20T10:55:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3b7c2b7-f029-4942-80e6-a588e7ac9d03 to /host/opt/cni/bin/\\\\n2026-03-20T10:55:12Z [verbose] multus-daemon started\\\\n2026-03-20T10:55:12Z [verbose] Readiness Indicator file check\\\\n2026-03-20T10:55:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multu
s\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:12 crc kubenswrapper[4695]: I0320 10:56:12.999698 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:03Z\\\",\\\"message\\\":\\\"ork=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver/apiserver_TCP_cluster\\\\\\\", UUID:\\\\\\\"d71b38eb-32af-4c0f-9490-7c317c111e3a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver/apiserver_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.93\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0320 10:56:03.970299 7146 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nx4bc_openshift-ovn-kubernetes(7010d107-c3b1-4cc2-83c2-523df13ecd43)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc25931
3e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:12Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:13 crc kubenswrapper[4695]: I0320 10:56:13.011070 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qrsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53713843-d62b-411e-908c-18f9452f6bf2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecd827c2bd1743e23f1f976847aab27852abff1a9f4bd54950e6d359d2ee2533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qrsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:13 crc kubenswrapper[4695]: I0320 10:56:13.024557 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e6a699ee6242e5555c1c7029f5f66e1ff2d87d2115fdaf7edf95489007330f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f834ec041a5e5fd186567eed4651136f52f2a785f2eee742e12da0470a1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-85t4r\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:13 crc kubenswrapper[4695]: I0320 10:56:13.046523 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7
c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:13 crc kubenswrapper[4695]: E0320 10:56:13.051544 4695 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 10:56:13 crc kubenswrapper[4695]: I0320 10:56:13.063124 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"l
astState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\",\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:13 crc kubenswrapper[4695]: I0320 10:56:13.075239 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:13 crc kubenswrapper[4695]: I0320 10:56:13.088578 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:13 crc kubenswrapper[4695]: I0320 10:56:13.102055 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:13 crc kubenswrapper[4695]: I0320 10:56:13.113291 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h5s76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0468323-460e-4bf3-be74-9c2330bde834\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h5s76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:13 crc 
kubenswrapper[4695]: I0320 10:56:13.127352 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1939bad4-97af-4184-bbe4-36f87795a4a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014756681bda842f91150730dfafc4c4cbcd129a645af7882f07a36b9a11243f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9848381ec62a8e9785a42314a255bc22c04cc2f52153c5ea0b1f797b56282bd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 10:53:45.815668 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 10:53:45.818152 1 observer_polling.go:159] Starting file observer\\\\nI0320 10:53:45.895681 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 10:53:45.901047 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 10:54:15.135198 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 10:54:15.135311 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:14Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33cdb444e3a559600355349111a57f3f6a8fdb565abafb0d1feb304f06a6f519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b382ffbd0c95442e6641029e57a71d00d56cdb073b590b5c154a807591cdc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77dab970ee3a33336d4a6f9830ed7c5fd19537a6dae4bcae01448402fd595ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:13 crc kubenswrapper[4695]: I0320 10:56:13.139874 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbd362cf-4525-4896-9335-1c8cda4303bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810780c72e3134aa544e31a393caed80dee3a1ba5db1752f5a3f77a775f84b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://048cd0ec21ce074a74507343df7e1a0d7a1421ca70c31d8c062d836fe6891575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8652889bb2db85da875dc4c200322271ea04d1ef7f217730f1d3e6798ff878e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9a539541a9bd8cbe81fbb9910dba50a7b1310c2deebc06bf4fec7d03069e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ac9a539541a9bd8cbe81fbb9910dba50a7b1310c2deebc06bf4fec7d03069e4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:13 crc kubenswrapper[4695]: I0320 10:56:13.156072 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:13 crc kubenswrapper[4695]: I0320 10:56:13.167590 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:13Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:13 crc kubenswrapper[4695]: I0320 10:56:13.887165 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:13 crc kubenswrapper[4695]: I0320 10:56:13.887183 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:13 crc kubenswrapper[4695]: E0320 10:56:13.887716 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:13 crc kubenswrapper[4695]: E0320 10:56:13.887789 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:13 crc kubenswrapper[4695]: I0320 10:56:13.887278 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:13 crc kubenswrapper[4695]: E0320 10:56:13.888021 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:14 crc kubenswrapper[4695]: I0320 10:56:14.886986 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:56:14 crc kubenswrapper[4695]: E0320 10:56:14.887296 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:56:15 crc kubenswrapper[4695]: I0320 10:56:15.505160 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:15 crc kubenswrapper[4695]: I0320 10:56:15.505207 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:15 crc kubenswrapper[4695]: I0320 10:56:15.505219 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:15 crc kubenswrapper[4695]: I0320 10:56:15.505236 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:15 crc kubenswrapper[4695]: I0320 10:56:15.505248 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:15Z","lastTransitionTime":"2026-03-20T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:15 crc kubenswrapper[4695]: E0320 10:56:15.520786 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:15 crc kubenswrapper[4695]: I0320 10:56:15.525618 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:15 crc kubenswrapper[4695]: I0320 10:56:15.525683 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:15 crc kubenswrapper[4695]: I0320 10:56:15.525702 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:15 crc kubenswrapper[4695]: I0320 10:56:15.525725 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:15 crc kubenswrapper[4695]: I0320 10:56:15.525738 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:15Z","lastTransitionTime":"2026-03-20T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:15 crc kubenswrapper[4695]: E0320 10:56:15.541473 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:15 crc kubenswrapper[4695]: I0320 10:56:15.546930 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:15 crc kubenswrapper[4695]: I0320 10:56:15.546977 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:15 crc kubenswrapper[4695]: I0320 10:56:15.546987 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:15 crc kubenswrapper[4695]: I0320 10:56:15.547002 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:15 crc kubenswrapper[4695]: I0320 10:56:15.547013 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:15Z","lastTransitionTime":"2026-03-20T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:15 crc kubenswrapper[4695]: E0320 10:56:15.562276 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:15 crc kubenswrapper[4695]: I0320 10:56:15.567397 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:15 crc kubenswrapper[4695]: I0320 10:56:15.567442 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:15 crc kubenswrapper[4695]: I0320 10:56:15.567456 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:15 crc kubenswrapper[4695]: I0320 10:56:15.567479 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:15 crc kubenswrapper[4695]: I0320 10:56:15.567496 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:15Z","lastTransitionTime":"2026-03-20T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:15 crc kubenswrapper[4695]: E0320 10:56:15.581704 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:15 crc kubenswrapper[4695]: I0320 10:56:15.587865 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:15 crc kubenswrapper[4695]: I0320 10:56:15.587958 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:15 crc kubenswrapper[4695]: I0320 10:56:15.587973 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:15 crc kubenswrapper[4695]: I0320 10:56:15.587996 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:15 crc kubenswrapper[4695]: I0320 10:56:15.588010 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:15Z","lastTransitionTime":"2026-03-20T10:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:15 crc kubenswrapper[4695]: E0320 10:56:15.602974 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:15Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:15Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:15 crc kubenswrapper[4695]: E0320 10:56:15.603156 4695 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:56:15 crc kubenswrapper[4695]: I0320 10:56:15.886787 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:15 crc kubenswrapper[4695]: I0320 10:56:15.886969 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:15 crc kubenswrapper[4695]: I0320 10:56:15.887004 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:15 crc kubenswrapper[4695]: E0320 10:56:15.887126 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:15 crc kubenswrapper[4695]: E0320 10:56:15.887480 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:15 crc kubenswrapper[4695]: E0320 10:56:15.887637 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:16 crc kubenswrapper[4695]: I0320 10:56:16.887271 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:56:16 crc kubenswrapper[4695]: E0320 10:56:16.887502 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:56:17 crc kubenswrapper[4695]: I0320 10:56:17.886553 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:17 crc kubenswrapper[4695]: I0320 10:56:17.886557 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:17 crc kubenswrapper[4695]: E0320 10:56:17.886715 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:17 crc kubenswrapper[4695]: E0320 10:56:17.886934 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:17 crc kubenswrapper[4695]: I0320 10:56:17.887306 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:17 crc kubenswrapper[4695]: E0320 10:56:17.887792 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:18 crc kubenswrapper[4695]: E0320 10:56:18.053326 4695 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:56:18 crc kubenswrapper[4695]: I0320 10:56:18.886318 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:56:18 crc kubenswrapper[4695]: E0320 10:56:18.886475 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:56:19 crc kubenswrapper[4695]: I0320 10:56:19.886742 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:19 crc kubenswrapper[4695]: I0320 10:56:19.886890 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:19 crc kubenswrapper[4695]: E0320 10:56:19.886968 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:19 crc kubenswrapper[4695]: I0320 10:56:19.887104 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:19 crc kubenswrapper[4695]: E0320 10:56:19.887359 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:19 crc kubenswrapper[4695]: E0320 10:56:19.887468 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:19 crc kubenswrapper[4695]: I0320 10:56:19.888422 4695 scope.go:117] "RemoveContainer" containerID="dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044" Mar 20 10:56:19 crc kubenswrapper[4695]: E0320 10:56:19.888626 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nx4bc_openshift-ovn-kubernetes(7010d107-c3b1-4cc2-83c2-523df13ecd43)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" Mar 20 10:56:20 crc kubenswrapper[4695]: I0320 10:56:20.887026 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:56:20 crc kubenswrapper[4695]: E0320 10:56:20.887258 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:56:21 crc kubenswrapper[4695]: I0320 10:56:21.886713 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:21 crc kubenswrapper[4695]: I0320 10:56:21.886765 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:21 crc kubenswrapper[4695]: I0320 10:56:21.886813 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:21 crc kubenswrapper[4695]: E0320 10:56:21.886919 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:21 crc kubenswrapper[4695]: E0320 10:56:21.887070 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:21 crc kubenswrapper[4695]: E0320 10:56:21.887127 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:22 crc kubenswrapper[4695]: I0320 10:56:22.886761 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:56:22 crc kubenswrapper[4695]: E0320 10:56:22.886947 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:56:22 crc kubenswrapper[4695]: I0320 10:56:22.902639 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4695]: I0320 10:56:22.918698 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4695]: I0320 10:56:22.935042 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4695]: I0320 10:56:22.955833 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4695]: I0320 10:56:22.968658 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:22 crc kubenswrapper[4695]: I0320 10:56:22.991128 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308a0ad5e55245d181191a0ed8b1dfd84d8aa18e
32d617bed8e3611679fc5b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath
\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e49
6fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:22Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4695]: I0320 10:56:23.015369 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4695]: I0320 10:56:23.032348 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\",\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4695]: I0320 10:56:23.048662 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4695]: E0320 10:56:23.053891 4695 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 10:56:23 crc kubenswrapper[4695]: I0320 10:56:23.069123 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05945c73a4c93e5fd6ce1bc322d9acf71d5c1005cd47a22875bfe3b8a3eb8806\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:57Z\\\",\\\"message\\\":\\\"2026-03-20T10:55:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3b7c2b7-f029-4942-80e6-a588e7ac9d03\\\\n2026-03-20T10:55:12+00:00 [cnibincopy] Successfully moved files 
in /host/opt/cni/bin/upgrade_b3b7c2b7-f029-4942-80e6-a588e7ac9d03 to /host/opt/cni/bin/\\\\n2026-03-20T10:55:12Z [verbose] multus-daemon started\\\\n2026-03-20T10:55:12Z [verbose] Readiness Indicator file check\\\\n2026-03-20T10:55:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\
":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4695]: I0320 10:56:23.092453 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:03Z\\\",\\\"message\\\":\\\"ork=default, existing lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver/apiserver_TCP_cluster\\\\\\\", UUID:\\\\\\\"d71b38eb-32af-4c0f-9490-7c317c111e3a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver/apiserver_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.93\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0320 10:56:03.970299 7146 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nx4bc_openshift-ovn-kubernetes(7010d107-c3b1-4cc2-83c2-523df13ecd43)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc25931
3e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4695]: I0320 10:56:23.105799 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qrsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53713843-d62b-411e-908c-18f9452f6bf2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecd827c2bd1743e23f1f976847aab27852abff1a9f4bd54950e6d359d2ee2533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qrsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4695]: I0320 10:56:23.115764 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e6a699ee6242e5555c1c7029f5f66e1ff2d87d2115fdaf7edf95489007330f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f834ec041a5e5fd186567eed4651136f52f2a785f2eee742e12da0470a1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-85t4r\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4695]: I0320 10:56:23.127249 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1939bad4-97af-4184-bbe4-36f87795a4a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014756681bda842f91150730dfafc4c4cbcd129a645af7882f07a36b9a11243f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9848381ec62a8e9785a42314a255bc22c04cc2f52153c5ea0b1f797b56282bd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54
:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 10:53:45.815668 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 10:53:45.818152 1 observer_polling.go:159] Starting file observer\\\\nI0320 10:53:45.895681 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 10:53:45.901047 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 10:54:15.135198 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 10:54:15.135311 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:14Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33cdb444e3a559600355349111a57f3f6a8fdb565abafb0d1feb304f06a6f519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b382ffbd0c95442e6641029e57a71d00d56cdb073b590b5c154a807591cdc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77dab970ee3a33336d4a6f9830ed7c5fd19537a6dae4bcae01448402fd595ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4695]: I0320 10:56:23.138514 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbd362cf-4525-4896-9335-1c8cda4303bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810780c72e3134aa544e31a393caed80dee3a1ba5db1752f5a3f77a775f84b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://048cd0ec21ce074a74507343df7e1a0d7a1421ca70c31d8c062d836fe6891575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8652889bb2db85da875dc4c200322271ea04d1ef7f217730f1d3e6798ff878e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9a539541a9bd8cbe81fbb9910dba50a7b1310c2deebc06bf4fec7d03069e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ac9a539541a9bd8cbe81fbb9910dba50a7b1310c2deebc06bf4fec7d03069e4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4695]: I0320 10:56:23.154350 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4695]: I0320 10:56:23.168517 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4695]: I0320 10:56:23.179241 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc kubenswrapper[4695]: I0320 10:56:23.187408 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h5s76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0468323-460e-4bf3-be74-9c2330bde834\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h5s76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:23Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:23 crc 
kubenswrapper[4695]: I0320 10:56:23.886769 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:23 crc kubenswrapper[4695]: I0320 10:56:23.886824 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:23 crc kubenswrapper[4695]: I0320 10:56:23.886769 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:23 crc kubenswrapper[4695]: E0320 10:56:23.886938 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:23 crc kubenswrapper[4695]: E0320 10:56:23.887104 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:23 crc kubenswrapper[4695]: E0320 10:56:23.887781 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:24 crc kubenswrapper[4695]: I0320 10:56:24.886779 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:56:24 crc kubenswrapper[4695]: E0320 10:56:24.887113 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:56:25 crc kubenswrapper[4695]: I0320 10:56:25.507754 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0468323-460e-4bf3-be74-9c2330bde834-metrics-certs\") pod \"network-metrics-daemon-h5s76\" (UID: \"e0468323-460e-4bf3-be74-9c2330bde834\") " pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:56:25 crc kubenswrapper[4695]: E0320 10:56:25.507969 4695 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:25 crc kubenswrapper[4695]: E0320 10:56:25.508058 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0468323-460e-4bf3-be74-9c2330bde834-metrics-certs podName:e0468323-460e-4bf3-be74-9c2330bde834 nodeName:}" failed. No retries permitted until 2026-03-20 10:57:29.508034903 +0000 UTC m=+227.288640486 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e0468323-460e-4bf3-be74-9c2330bde834-metrics-certs") pod "network-metrics-daemon-h5s76" (UID: "e0468323-460e-4bf3-be74-9c2330bde834") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:25 crc kubenswrapper[4695]: I0320 10:56:25.886525 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:25 crc kubenswrapper[4695]: I0320 10:56:25.886526 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:25 crc kubenswrapper[4695]: E0320 10:56:25.886698 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:25 crc kubenswrapper[4695]: E0320 10:56:25.886840 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:25 crc kubenswrapper[4695]: I0320 10:56:25.886558 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:25 crc kubenswrapper[4695]: E0320 10:56:25.887083 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:25 crc kubenswrapper[4695]: I0320 10:56:25.950739 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:25 crc kubenswrapper[4695]: I0320 10:56:25.950801 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:25 crc kubenswrapper[4695]: I0320 10:56:25.950813 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:25 crc kubenswrapper[4695]: I0320 10:56:25.950831 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:25 crc kubenswrapper[4695]: I0320 10:56:25.950844 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:25Z","lastTransitionTime":"2026-03-20T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:25 crc kubenswrapper[4695]: E0320 10:56:25.964927 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:25Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:25 crc kubenswrapper[4695]: I0320 10:56:25.969199 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:25 crc kubenswrapper[4695]: I0320 10:56:25.969252 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:25 crc kubenswrapper[4695]: I0320 10:56:25.969268 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:25 crc kubenswrapper[4695]: I0320 10:56:25.969289 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:25 crc kubenswrapper[4695]: I0320 10:56:25.969302 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:25Z","lastTransitionTime":"2026-03-20T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:25 crc kubenswrapper[4695]: E0320 10:56:25.982244 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:25Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:25 crc kubenswrapper[4695]: I0320 10:56:25.986543 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:25 crc kubenswrapper[4695]: I0320 10:56:25.986577 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:25 crc kubenswrapper[4695]: I0320 10:56:25.986588 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:25 crc kubenswrapper[4695]: I0320 10:56:25.986603 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:25 crc kubenswrapper[4695]: I0320 10:56:25.986615 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:25Z","lastTransitionTime":"2026-03-20T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4695]: E0320 10:56:26.000320 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:25Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4695]: I0320 10:56:26.005165 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:26 crc kubenswrapper[4695]: I0320 10:56:26.005209 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:26 crc kubenswrapper[4695]: I0320 10:56:26.005221 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:26 crc kubenswrapper[4695]: I0320 10:56:26.005247 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:26 crc kubenswrapper[4695]: I0320 10:56:26.005261 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:26Z","lastTransitionTime":"2026-03-20T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4695]: E0320 10:56:26.020153 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4695]: I0320 10:56:26.024174 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:26 crc kubenswrapper[4695]: I0320 10:56:26.024240 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:26 crc kubenswrapper[4695]: I0320 10:56:26.024258 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:26 crc kubenswrapper[4695]: I0320 10:56:26.024280 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:26 crc kubenswrapper[4695]: I0320 10:56:26.024296 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:26Z","lastTransitionTime":"2026-03-20T10:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:26 crc kubenswrapper[4695]: E0320 10:56:26.037426 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:26Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"44f103d6-b2e8-4356-8f43-42f34aee1d07\\\",\\\"systemUUID\\\":\\\"0a63b4f3-f2b8-41ba-8013-dab8ba41cf9d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:26 crc kubenswrapper[4695]: E0320 10:56:26.037600 4695 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:56:26 crc kubenswrapper[4695]: I0320 10:56:26.886447 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:56:26 crc kubenswrapper[4695]: E0320 10:56:26.886660 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:56:27 crc kubenswrapper[4695]: I0320 10:56:27.886005 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:27 crc kubenswrapper[4695]: I0320 10:56:27.886095 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:27 crc kubenswrapper[4695]: I0320 10:56:27.886114 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:27 crc kubenswrapper[4695]: E0320 10:56:27.886224 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:27 crc kubenswrapper[4695]: E0320 10:56:27.886421 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:27 crc kubenswrapper[4695]: E0320 10:56:27.886509 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:28 crc kubenswrapper[4695]: E0320 10:56:28.055209 4695 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:56:28 crc kubenswrapper[4695]: I0320 10:56:28.886400 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:56:28 crc kubenswrapper[4695]: E0320 10:56:28.886582 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:56:29 crc kubenswrapper[4695]: I0320 10:56:29.887083 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:29 crc kubenswrapper[4695]: I0320 10:56:29.887083 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:29 crc kubenswrapper[4695]: E0320 10:56:29.887308 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:29 crc kubenswrapper[4695]: E0320 10:56:29.887499 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:29 crc kubenswrapper[4695]: I0320 10:56:29.887113 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:29 crc kubenswrapper[4695]: E0320 10:56:29.887635 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:30 crc kubenswrapper[4695]: I0320 10:56:30.887125 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:56:30 crc kubenswrapper[4695]: E0320 10:56:30.887405 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:56:31 crc kubenswrapper[4695]: I0320 10:56:31.886452 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:31 crc kubenswrapper[4695]: I0320 10:56:31.886452 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:31 crc kubenswrapper[4695]: I0320 10:56:31.886483 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:31 crc kubenswrapper[4695]: E0320 10:56:31.886737 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:31 crc kubenswrapper[4695]: E0320 10:56:31.886813 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:31 crc kubenswrapper[4695]: E0320 10:56:31.886900 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:32 crc kubenswrapper[4695]: I0320 10:56:32.886327 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:56:32 crc kubenswrapper[4695]: E0320 10:56:32.886486 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:56:32 crc kubenswrapper[4695]: I0320 10:56:32.902792 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4695]: I0320 10:56:32.915493 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7859c924-84d7-4855-901e-c77a02c56e3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0e5887fc79b70074f9817ea07e98371d4542b1e36fc056118c241cf2653f4ad2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f52c3cc7c395c498c816cd540172b9c782623535
c14aff204ea0efa08008cef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lw6vr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-bnwz5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4695]: I0320 10:56:32.930304 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0500a369-efac-495f-83aa-8b400fd54206\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://308a0ad5e55245d181191a0ed8b1dfd84d8aa18e32d617bed8e3611679fc5b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://013719fb092a4675333e21111b9b9821e50612ac578ef7549bdf567514824e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84f3d8c8f8dd633d5ff77c823052d868836a49ea126a552017bd6c2871369cc6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e26301330250280c52c6efbce507afa64ded570826186b978ebcc586f85dbc0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://27c6b
10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://27c6b10d3c6e602f397916200d49742c525ff7c500b1ead3ca8b8eacd20af648\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28aae648a4ff018fd2337daf5f1d85a6328243fe3bf8a767582a582cb50a3893\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fee5be4c7e9f5982d25018de61e39f354a6f8963c7313c112b6e04728cfbd7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tpk42\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-6jlvp\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4695]: I0320 10:56:32.940415 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b6dedae7-0780-4343-8917-0d02e749404e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://18dda7582db85da130b9e95842a979ea3696e8951545c323761c2df21261f5ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a76c3d115fec42c8e3d261597460fe860574b6ae44a1caf88b640e9a811e7d84\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4695]: I0320 10:56:32.952043 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:12Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b73cec7ed7dc87e880ba3c6299287b49d19cb07ef2d942cb4425c24e90f232d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4695]: I0320 10:56:32.962504 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-75zwx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"548a9f17-19d1-4267-a179-75a82fe79a42\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d808e5ec474231c28bbf3de52b114a6f0c38a6443c96a38ea515d7fdfc0de797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-gvhmk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:07Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-75zwx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:32 crc kubenswrapper[4695]: I0320 10:56:32.977845 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-hg7g5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"52301735-de4f-4672-9e4d-6bd74bccedad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://05945c73a4c93e5fd6ce1bc322d9acf71d5c1005cd47a22875bfe3b8a3eb880
6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:55:57Z\\\",\\\"message\\\":\\\"2026-03-20T10:55:12+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_b3b7c2b7-f029-4942-80e6-a588e7ac9d03\\\\n2026-03-20T10:55:12+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_b3b7c2b7-f029-4942-80e6-a588e7ac9d03 to /host/opt/cni/bin/\\\\n2026-03-20T10:55:12Z [verbose] multus-daemon started\\\\n2026-03-20T10:55:12Z [verbose] Readiness Indicator file check\\\\n2026-03-20T10:55:57Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:08Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-69bmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-multus\"/\"multus-hg7g5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4695]: I0320 10:56:33.000716 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7010d107-c3b1-4cc2-83c2-523df13ecd43\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:03Z\\\",\\\"message\\\":\\\"ork=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver/apiserver_TCP_cluster\\\\\\\", UUID:\\\\\\\"d71b38eb-32af-4c0f-9490-7c317c111e3a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver/apiserver_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.93\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0320 10:56:03.970299 7146 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-nx4bc_openshift-ovn-kubernetes(7010d107-c3b1-4cc2-83c2-523df13ecd43)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://253af5c0eb0bc25931
3e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qdz7d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:08Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-nx4bc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:32Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4695]: I0320 10:56:33.012207 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qrsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"53713843-d62b-411e-908c-18f9452f6bf2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ecd827c2bd1743e23f1f976847aab27852abff1a9f4bd54950e6d359d2ee2533\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8vbjh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qrsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4695]: I0320 10:56:33.028931 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7f8a6fc9-0bc7-4086-9436-8c5a3cfcb5f8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d8e6a699ee6242e5555c1c7029f5f66e1ff2d87d2115fdaf7edf95489007330f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://25f834ec041a5e5fd186567eed4651136f52f2a785f2eee742e12da0470a1dae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ftmjs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:20Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-85t4r\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4695]: I0320 10:56:33.049733 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40a74534-a320-4528-913a-0d82cfe9baac\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://59d471cbb419ceb98d07f80c5853c1d013b7a5a55161beeae7658fb765278c9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
tc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86c13e4dc6a7422dba5ef8ec9b0adf1a4a2c2f3fdaa6ad574179f18e2341ccc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4440e83fa9a56569e669c35e7c6babfbee0e9b4c0bc727cbfbb24809cbd7a7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://2dcc9adc9343ac0372f313dd74e923525bf6fcbd9aa4a82b660d4407eaa8a79a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://662f0fdbfdfe1b65bc8f6936f178fde1336bdc3ea7525d420f6dfddb2829b736\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7
c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6d444bc1b00c1ac5cf533970748eca0d76f828da130262ec046b9414c3e3366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://853c37cab92d3df958b96298bd548590511cc1e2396df5294b88905c9ac5aa85\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a73569747ac32cf649f4e2c1792ef7f318bd471b106c2311c894388e069a6d0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4695]: E0320 10:56:33.055968 4695 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 10:56:33 crc kubenswrapper[4695]: I0320 10:56:33.069735 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"65c5b07a-a076-493a-8d05-5b297c74da55\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"l
astState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:23Z\\\",\\\"message\\\":\\\"W0320 10:54:23.004573 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 10:54:23.005039 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774004063 cert, and key in /tmp/serving-cert-3705297579/serving-signer.crt, /tmp/serving-cert-3705297579/serving-signer.key\\\\nI0320 10:54:23.396455 1 observer_polling.go:159] Starting file observer\\\\nW0320 10:54:23.405538 1 builder.go:272] unable to get owner reference (falling back to namespace): Unauthorized\\\\nI0320 10:54:23.405669 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:54:23.406481 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3705297579/tls.crt::/tmp/serving-cert-3705297579/tls.key\\\\\\\"\\\\nF0320 10:54:23.589644 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:54:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4695]: I0320 10:56:33.087251 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c6c286c4ad4a3ad9893db464d4602ca29be0979cdbeb6703d34c465eedc4fe8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4695]: I0320 10:56:33.101274 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:07Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b899a7bc5a092204058cc00b3bc995c59cdc2d0755d9a0c9862c8336a4a6481c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebc0146bd15251a24e287c3ab04c7a6f5aa00f3dd61397b0a9de0909746b64a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4695]: I0320 10:56:33.118349 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4695]: I0320 10:56:33.134480 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-h5s76" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0468323-460e-4bf3-be74-9c2330bde834\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k8ndm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:21Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-h5s76\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc 
kubenswrapper[4695]: I0320 10:56:33.147522 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1939bad4-97af-4184-bbe4-36f87795a4a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://014756681bda842f91150730dfafc4c4cbcd129a645af7882f07a36b9a11243f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e9848381ec62a8e9785a42314a255bc22c04cc2f52153c5ea0b1f797b56282bd\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:54:15Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 10:53:45.815668 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 10:53:45.818152 1 observer_polling.go:159] Starting file observer\\\\nI0320 10:53:45.895681 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 10:53:45.901047 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 10:54:15.135198 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 10:54:15.135311 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:54:14Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:54:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33cdb444e3a559600355349111a57f3f6a8fdb565abafb0d1feb304f06a6f519\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://74b382ffbd0c95442e6641029e57a71d00d56cdb073b590b5c154a807591cdc4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b77dab970ee3a33336d4a6f9830ed7c5fd19537a6dae4bcae01448402fd595ba\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4695]: I0320 10:56:33.160812 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cbd362cf-4525-4896-9335-1c8cda4303bb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:53:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://810780c72e3134aa544e31a393caed80dee3a1ba5db1752f5a3f77a775f84b9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://048cd0ec21ce074a74507343df7e1a0d7a1421ca70c31d8c062d836fe6891575\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8652889bb2db85da875dc4c200322271ea04d1ef7f217730f1d3e6798ff878e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:53:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac9a539541a9bd8cbe81fbb9910dba50a7b1310c2deebc06bf4fec7d03069e4f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://ac9a539541a9bd8cbe81fbb9910dba50a7b1310c2deebc06bf4fec7d03069e4f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:53:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:53:44Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:53:43Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4695]: I0320 10:56:33.175382 4695 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:54:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:33Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:33 crc kubenswrapper[4695]: I0320 10:56:33.886244 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:33 crc kubenswrapper[4695]: I0320 10:56:33.886736 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:33 crc kubenswrapper[4695]: I0320 10:56:33.886776 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:33 crc kubenswrapper[4695]: E0320 10:56:33.886992 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:33 crc kubenswrapper[4695]: E0320 10:56:33.887065 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:33 crc kubenswrapper[4695]: E0320 10:56:33.887140 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:34 crc kubenswrapper[4695]: I0320 10:56:34.886749 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:56:34 crc kubenswrapper[4695]: E0320 10:56:34.887070 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:56:34 crc kubenswrapper[4695]: I0320 10:56:34.887960 4695 scope.go:117] "RemoveContainer" containerID="dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044" Mar 20 10:56:34 crc kubenswrapper[4695]: E0320 10:56:34.888176 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-nx4bc_openshift-ovn-kubernetes(7010d107-c3b1-4cc2-83c2-523df13ecd43)\"" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" Mar 20 10:56:35 crc kubenswrapper[4695]: I0320 10:56:35.886897 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:35 crc kubenswrapper[4695]: I0320 10:56:35.886963 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:35 crc kubenswrapper[4695]: I0320 10:56:35.886897 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:35 crc kubenswrapper[4695]: E0320 10:56:35.887089 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:35 crc kubenswrapper[4695]: E0320 10:56:35.887216 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:35 crc kubenswrapper[4695]: E0320 10:56:35.887327 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.312670 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.312748 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.312762 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.312784 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.312799 4695 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:36Z","lastTransitionTime":"2026-03-20T10:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.374300 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-4qlqk"] Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.374861 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4qlqk" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.377590 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.378217 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.378304 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.379191 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.391852 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=95.391819179 podStartE2EDuration="1m35.391819179s" podCreationTimestamp="2026-03-20 10:55:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:56:36.391820399 +0000 UTC m=+174.172425962" watchObservedRunningTime="2026-03-20 10:56:36.391819179 +0000 UTC m=+174.172424742" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.429882 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0b7e7eae-1979-450d-bc30-baec053d279c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4qlqk\" (UID: \"0b7e7eae-1979-450d-bc30-baec053d279c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4qlqk" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.429964 4695 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b7e7eae-1979-450d-bc30-baec053d279c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4qlqk\" (UID: \"0b7e7eae-1979-450d-bc30-baec053d279c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4qlqk" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.429988 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b7e7eae-1979-450d-bc30-baec053d279c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4qlqk\" (UID: \"0b7e7eae-1979-450d-bc30-baec053d279c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4qlqk" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.430014 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0b7e7eae-1979-450d-bc30-baec053d279c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4qlqk\" (UID: \"0b7e7eae-1979-450d-bc30-baec053d279c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4qlqk" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.430052 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b7e7eae-1979-450d-bc30-baec053d279c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4qlqk\" (UID: \"0b7e7eae-1979-450d-bc30-baec053d279c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4qlqk" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.442479 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6jlvp" podStartSLOduration=130.442458241 podStartE2EDuration="2m10.442458241s" 
podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:56:36.442043982 +0000 UTC m=+174.222649555" watchObservedRunningTime="2026-03-20 10:56:36.442458241 +0000 UTC m=+174.223063804" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.442831 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-75zwx" podStartSLOduration=131.44282478 podStartE2EDuration="2m11.44282478s" podCreationTimestamp="2026-03-20 10:54:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:56:36.420780619 +0000 UTC m=+174.201386182" watchObservedRunningTime="2026-03-20 10:56:36.44282478 +0000 UTC m=+174.223430343" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.496363 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qrsdf" podStartSLOduration=131.496338948 podStartE2EDuration="2m11.496338948s" podCreationTimestamp="2026-03-20 10:54:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:56:36.481595597 +0000 UTC m=+174.262201150" watchObservedRunningTime="2026-03-20 10:56:36.496338948 +0000 UTC m=+174.276944511" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.521274 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=91.521244425 podStartE2EDuration="1m31.521244425s" podCreationTimestamp="2026-03-20 10:55:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:56:36.521238125 +0000 UTC m=+174.301843698" watchObservedRunningTime="2026-03-20 10:56:36.521244425 +0000 UTC 
m=+174.301849988" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.521678 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-85t4r" podStartSLOduration=130.521670625 podStartE2EDuration="2m10.521670625s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:56:36.496538443 +0000 UTC m=+174.277144006" watchObservedRunningTime="2026-03-20 10:56:36.521670625 +0000 UTC m=+174.302276188" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.531266 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b7e7eae-1979-450d-bc30-baec053d279c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4qlqk\" (UID: \"0b7e7eae-1979-450d-bc30-baec053d279c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4qlqk" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.531332 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0b7e7eae-1979-450d-bc30-baec053d279c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4qlqk\" (UID: \"0b7e7eae-1979-450d-bc30-baec053d279c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4qlqk" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.531361 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b7e7eae-1979-450d-bc30-baec053d279c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4qlqk\" (UID: \"0b7e7eae-1979-450d-bc30-baec053d279c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4qlqk" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.531386 4695 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0b7e7eae-1979-450d-bc30-baec053d279c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4qlqk\" (UID: \"0b7e7eae-1979-450d-bc30-baec053d279c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4qlqk" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.531447 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b7e7eae-1979-450d-bc30-baec053d279c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4qlqk\" (UID: \"0b7e7eae-1979-450d-bc30-baec053d279c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4qlqk" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.531872 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0b7e7eae-1979-450d-bc30-baec053d279c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4qlqk\" (UID: \"0b7e7eae-1979-450d-bc30-baec053d279c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4qlqk" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.532020 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0b7e7eae-1979-450d-bc30-baec053d279c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4qlqk\" (UID: \"0b7e7eae-1979-450d-bc30-baec053d279c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4qlqk" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.532782 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b7e7eae-1979-450d-bc30-baec053d279c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4qlqk\" (UID: \"0b7e7eae-1979-450d-bc30-baec053d279c\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4qlqk" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.539107 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b7e7eae-1979-450d-bc30-baec053d279c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4qlqk\" (UID: \"0b7e7eae-1979-450d-bc30-baec053d279c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4qlqk" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.546443 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=102.546408737 podStartE2EDuration="1m42.546408737s" podCreationTimestamp="2026-03-20 10:54:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:56:36.544479773 +0000 UTC m=+174.325085356" watchObservedRunningTime="2026-03-20 10:56:36.546408737 +0000 UTC m=+174.327014310" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.551213 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b7e7eae-1979-450d-bc30-baec053d279c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4qlqk\" (UID: \"0b7e7eae-1979-450d-bc30-baec053d279c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4qlqk" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.606354 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-hg7g5" podStartSLOduration=130.606335094 podStartE2EDuration="2m10.606335094s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:56:36.582237837 +0000 UTC m=+174.362843400" 
watchObservedRunningTime="2026-03-20 10:56:36.606335094 +0000 UTC m=+174.386940657" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.635841 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=33.635808356 podStartE2EDuration="33.635808356s" podCreationTimestamp="2026-03-20 10:56:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:56:36.635295945 +0000 UTC m=+174.415901508" watchObservedRunningTime="2026-03-20 10:56:36.635808356 +0000 UTC m=+174.416413919" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.651389 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=56.651368057 podStartE2EDuration="56.651368057s" podCreationTimestamp="2026-03-20 10:55:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:56:36.651282745 +0000 UTC m=+174.431888328" watchObservedRunningTime="2026-03-20 10:56:36.651368057 +0000 UTC m=+174.431973610" Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.691697 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4qlqk"
Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.731415 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podStartSLOduration=130.731392499 podStartE2EDuration="2m10.731392499s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:56:36.729819612 +0000 UTC m=+174.510425175" watchObservedRunningTime="2026-03-20 10:56:36.731392499 +0000 UTC m=+174.511998052"
Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.849836 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4qlqk" event={"ID":"0b7e7eae-1979-450d-bc30-baec053d279c","Type":"ContainerStarted","Data":"2f1633be0b4896bea52fe7b6caa32fb7919f94cda310f05a9785bca04235771f"}
Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.886551 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76"
Mar 20 10:56:36 crc kubenswrapper[4695]: E0320 10:56:36.886747 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834"
Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.948739 4695 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Mar 20 10:56:36 crc kubenswrapper[4695]: I0320 10:56:36.959628 4695 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 20 10:56:37 crc kubenswrapper[4695]: I0320 10:56:37.854750 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4qlqk" event={"ID":"0b7e7eae-1979-450d-bc30-baec053d279c","Type":"ContainerStarted","Data":"26506b35732b5a41a2e1268802c3fe8a980d8870d21d94db31b4f60efdd63e15"}
Mar 20 10:56:37 crc kubenswrapper[4695]: I0320 10:56:37.886071 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:56:37 crc kubenswrapper[4695]: I0320 10:56:37.886116 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:56:37 crc kubenswrapper[4695]: I0320 10:56:37.886138 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:56:37 crc kubenswrapper[4695]: E0320 10:56:37.886238 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 10:56:37 crc kubenswrapper[4695]: E0320 10:56:37.886353 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 10:56:37 crc kubenswrapper[4695]: E0320 10:56:37.886471 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:56:37 crc kubenswrapper[4695]: I0320 10:56:37.888722 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4qlqk" podStartSLOduration=131.888693265 podStartE2EDuration="2m11.888693265s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:56:37.888652234 +0000 UTC m=+175.669257817" watchObservedRunningTime="2026-03-20 10:56:37.888693265 +0000 UTC m=+175.669298828"
Mar 20 10:56:38 crc kubenswrapper[4695]: E0320 10:56:38.057661 4695 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 10:56:38 crc kubenswrapper[4695]: I0320 10:56:38.886829 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76"
Mar 20 10:56:38 crc kubenswrapper[4695]: E0320 10:56:38.887008 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834"
Mar 20 10:56:39 crc kubenswrapper[4695]: I0320 10:56:39.886887 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:56:39 crc kubenswrapper[4695]: I0320 10:56:39.887044 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:56:39 crc kubenswrapper[4695]: E0320 10:56:39.887065 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 10:56:39 crc kubenswrapper[4695]: I0320 10:56:39.887176 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:56:39 crc kubenswrapper[4695]: E0320 10:56:39.887199 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 10:56:39 crc kubenswrapper[4695]: E0320 10:56:39.887391 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:56:40 crc kubenswrapper[4695]: I0320 10:56:40.886055 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76"
Mar 20 10:56:40 crc kubenswrapper[4695]: E0320 10:56:40.886250 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834"
Mar 20 10:56:41 crc kubenswrapper[4695]: I0320 10:56:41.886186 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:56:41 crc kubenswrapper[4695]: I0320 10:56:41.886188 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:56:41 crc kubenswrapper[4695]: I0320 10:56:41.886186 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:56:41 crc kubenswrapper[4695]: E0320 10:56:41.886376 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 10:56:41 crc kubenswrapper[4695]: E0320 10:56:41.886488 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:56:41 crc kubenswrapper[4695]: E0320 10:56:41.886571 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 10:56:42 crc kubenswrapper[4695]: I0320 10:56:42.886045 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76"
Mar 20 10:56:42 crc kubenswrapper[4695]: E0320 10:56:42.887327 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834"
Mar 20 10:56:43 crc kubenswrapper[4695]: E0320 10:56:43.058438 4695 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 10:56:43 crc kubenswrapper[4695]: I0320 10:56:43.886160 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:56:43 crc kubenswrapper[4695]: I0320 10:56:43.886168 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:56:43 crc kubenswrapper[4695]: I0320 10:56:43.886599 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:56:43 crc kubenswrapper[4695]: E0320 10:56:43.886749 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:56:43 crc kubenswrapper[4695]: E0320 10:56:43.886822 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 10:56:43 crc kubenswrapper[4695]: E0320 10:56:43.886951 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 10:56:44 crc kubenswrapper[4695]: I0320 10:56:44.879339 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hg7g5_52301735-de4f-4672-9e4d-6bd74bccedad/kube-multus/1.log"
Mar 20 10:56:44 crc kubenswrapper[4695]: I0320 10:56:44.881043 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hg7g5_52301735-de4f-4672-9e4d-6bd74bccedad/kube-multus/0.log"
Mar 20 10:56:44 crc kubenswrapper[4695]: I0320 10:56:44.881089 4695 generic.go:334] "Generic (PLEG): container finished" podID="52301735-de4f-4672-9e4d-6bd74bccedad" containerID="05945c73a4c93e5fd6ce1bc322d9acf71d5c1005cd47a22875bfe3b8a3eb8806" exitCode=1
Mar 20 10:56:44 crc kubenswrapper[4695]: I0320 10:56:44.881125 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hg7g5" event={"ID":"52301735-de4f-4672-9e4d-6bd74bccedad","Type":"ContainerDied","Data":"05945c73a4c93e5fd6ce1bc322d9acf71d5c1005cd47a22875bfe3b8a3eb8806"}
Mar 20 10:56:44 crc kubenswrapper[4695]: I0320 10:56:44.881172 4695 scope.go:117] "RemoveContainer" containerID="4a4a00d5484fe3307887c931c3eddec412c25a0b00fe4c669233f37353175a0e"
Mar 20 10:56:44 crc kubenswrapper[4695]: I0320 10:56:44.882169 4695 scope.go:117] "RemoveContainer" containerID="05945c73a4c93e5fd6ce1bc322d9acf71d5c1005cd47a22875bfe3b8a3eb8806"
Mar 20 10:56:44 crc kubenswrapper[4695]: E0320 10:56:44.882522 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-hg7g5_openshift-multus(52301735-de4f-4672-9e4d-6bd74bccedad)\"" pod="openshift-multus/multus-hg7g5" podUID="52301735-de4f-4672-9e4d-6bd74bccedad"
Mar 20 10:56:44 crc kubenswrapper[4695]: I0320 10:56:44.886216 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76"
Mar 20 10:56:44 crc kubenswrapper[4695]: E0320 10:56:44.886376 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834"
Mar 20 10:56:45 crc kubenswrapper[4695]: I0320 10:56:45.886211 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:56:45 crc kubenswrapper[4695]: I0320 10:56:45.886233 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:56:45 crc kubenswrapper[4695]: I0320 10:56:45.886288 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:56:45 crc kubenswrapper[4695]: E0320 10:56:45.886383 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 10:56:45 crc kubenswrapper[4695]: E0320 10:56:45.886530 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:56:45 crc kubenswrapper[4695]: E0320 10:56:45.886593 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 10:56:45 crc kubenswrapper[4695]: I0320 10:56:45.886839 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hg7g5_52301735-de4f-4672-9e4d-6bd74bccedad/kube-multus/1.log"
Mar 20 10:56:46 crc kubenswrapper[4695]: I0320 10:56:46.886475 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76"
Mar 20 10:56:46 crc kubenswrapper[4695]: E0320 10:56:46.886687 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834"
Mar 20 10:56:47 crc kubenswrapper[4695]: I0320 10:56:47.886552 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:56:47 crc kubenswrapper[4695]: I0320 10:56:47.886661 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:56:47 crc kubenswrapper[4695]: I0320 10:56:47.886716 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:56:47 crc kubenswrapper[4695]: E0320 10:56:47.886879 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 10:56:47 crc kubenswrapper[4695]: E0320 10:56:47.887028 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:56:47 crc kubenswrapper[4695]: E0320 10:56:47.887083 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 10:56:48 crc kubenswrapper[4695]: E0320 10:56:48.060204 4695 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 10:56:48 crc kubenswrapper[4695]: I0320 10:56:48.886267 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76"
Mar 20 10:56:48 crc kubenswrapper[4695]: E0320 10:56:48.886421 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834"
Mar 20 10:56:49 crc kubenswrapper[4695]: I0320 10:56:49.886047 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:56:49 crc kubenswrapper[4695]: I0320 10:56:49.886116 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:56:49 crc kubenswrapper[4695]: I0320 10:56:49.886385 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:56:49 crc kubenswrapper[4695]: E0320 10:56:49.886547 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:56:49 crc kubenswrapper[4695]: E0320 10:56:49.886633 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 10:56:49 crc kubenswrapper[4695]: E0320 10:56:49.886772 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 10:56:49 crc kubenswrapper[4695]: I0320 10:56:49.886840 4695 scope.go:117] "RemoveContainer" containerID="dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044"
Mar 20 10:56:50 crc kubenswrapper[4695]: I0320 10:56:50.886569 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76"
Mar 20 10:56:50 crc kubenswrapper[4695]: E0320 10:56:50.886751 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834"
Mar 20 10:56:50 crc kubenswrapper[4695]: I0320 10:56:50.908530 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx4bc_7010d107-c3b1-4cc2-83c2-523df13ecd43/ovnkube-controller/3.log"
Mar 20 10:56:50 crc kubenswrapper[4695]: I0320 10:56:50.911875 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" event={"ID":"7010d107-c3b1-4cc2-83c2-523df13ecd43","Type":"ContainerStarted","Data":"69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094"}
Mar 20 10:56:50 crc kubenswrapper[4695]: I0320 10:56:50.912448 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc"
Mar 20 10:56:51 crc kubenswrapper[4695]: I0320 10:56:51.101864 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" podStartSLOduration=145.101831787 podStartE2EDuration="2m25.101831787s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:56:50.949224225 +0000 UTC m=+188.729829808" watchObservedRunningTime="2026-03-20 10:56:51.101831787 +0000 UTC m=+188.882437370"
Mar 20 10:56:51 crc kubenswrapper[4695]: I0320 10:56:51.102676 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-h5s76"]
Mar 20 10:56:51 crc kubenswrapper[4695]: I0320 10:56:51.102825 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76"
Mar 20 10:56:51 crc kubenswrapper[4695]: E0320 10:56:51.102961 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834"
Mar 20 10:56:51 crc kubenswrapper[4695]: I0320 10:56:51.886408 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:56:51 crc kubenswrapper[4695]: E0320 10:56:51.887281 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 10:56:51 crc kubenswrapper[4695]: I0320 10:56:51.886669 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:56:51 crc kubenswrapper[4695]: I0320 10:56:51.886604 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:56:51 crc kubenswrapper[4695]: E0320 10:56:51.887410 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:56:51 crc kubenswrapper[4695]: E0320 10:56:51.887624 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 10:56:52 crc kubenswrapper[4695]: I0320 10:56:52.887014 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76"
Mar 20 10:56:52 crc kubenswrapper[4695]: E0320 10:56:52.889116 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834"
Mar 20 10:56:53 crc kubenswrapper[4695]: E0320 10:56:53.061058 4695 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 10:56:53 crc kubenswrapper[4695]: I0320 10:56:53.887051 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:56:53 crc kubenswrapper[4695]: E0320 10:56:53.887240 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 10:56:53 crc kubenswrapper[4695]: I0320 10:56:53.887407 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:56:53 crc kubenswrapper[4695]: I0320 10:56:53.887052 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:56:53 crc kubenswrapper[4695]: E0320 10:56:53.887634 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:56:53 crc kubenswrapper[4695]: E0320 10:56:53.887720 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 10:56:54 crc kubenswrapper[4695]: I0320 10:56:54.886169 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76"
Mar 20 10:56:54 crc kubenswrapper[4695]: E0320 10:56:54.886361 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834"
Mar 20 10:56:55 crc kubenswrapper[4695]: I0320 10:56:55.886704 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:56:55 crc kubenswrapper[4695]: I0320 10:56:55.886843 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:56:55 crc kubenswrapper[4695]: I0320 10:56:55.886704 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:56:55 crc kubenswrapper[4695]: E0320 10:56:55.886868 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 10:56:55 crc kubenswrapper[4695]: E0320 10:56:55.887084 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:56:55 crc kubenswrapper[4695]: E0320 10:56:55.887166 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 10:56:56 crc kubenswrapper[4695]: I0320 10:56:56.886644 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76"
Mar 20 10:56:56 crc kubenswrapper[4695]: E0320 10:56:56.886852 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834"
Mar 20 10:56:57 crc kubenswrapper[4695]: I0320 10:56:57.886452 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 10:56:57 crc kubenswrapper[4695]: I0320 10:56:57.886558 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:56:57 crc kubenswrapper[4695]: I0320 10:56:57.886613 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:56:57 crc kubenswrapper[4695]: E0320 10:56:57.886681 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 10:56:57 crc kubenswrapper[4695]: E0320 10:56:57.886961 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 10:56:57 crc kubenswrapper[4695]: E0320 10:56:57.887021 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 10:56:58 crc kubenswrapper[4695]: E0320 10:56:58.063105 4695 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 10:56:58 crc kubenswrapper[4695]: I0320 10:56:58.885988 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76"
Mar 20 10:56:58 crc kubenswrapper[4695]: E0320 10:56:58.886141 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834"
Mar 20 10:56:59 crc kubenswrapper[4695]: I0320 10:56:59.886783 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 10:56:59 crc kubenswrapper[4695]: I0320 10:56:59.886861 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 10:56:59 crc kubenswrapper[4695]: I0320 10:56:59.886817 4695 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:59 crc kubenswrapper[4695]: E0320 10:56:59.887211 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:59 crc kubenswrapper[4695]: E0320 10:56:59.887507 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:59 crc kubenswrapper[4695]: E0320 10:56:59.887411 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:59 crc kubenswrapper[4695]: I0320 10:56:59.887691 4695 scope.go:117] "RemoveContainer" containerID="05945c73a4c93e5fd6ce1bc322d9acf71d5c1005cd47a22875bfe3b8a3eb8806" Mar 20 10:57:00 crc kubenswrapper[4695]: I0320 10:57:00.886502 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:57:00 crc kubenswrapper[4695]: E0320 10:57:00.887078 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:57:00 crc kubenswrapper[4695]: I0320 10:57:00.951729 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hg7g5_52301735-de4f-4672-9e4d-6bd74bccedad/kube-multus/1.log" Mar 20 10:57:00 crc kubenswrapper[4695]: I0320 10:57:00.952172 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hg7g5" event={"ID":"52301735-de4f-4672-9e4d-6bd74bccedad","Type":"ContainerStarted","Data":"6288afe02e5624f347e826d8f85bfb546d5a45b435e01c0a0b3bd13018172586"} Mar 20 10:57:01 crc kubenswrapper[4695]: I0320 10:57:01.887080 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:01 crc kubenswrapper[4695]: E0320 10:57:01.887283 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:01 crc kubenswrapper[4695]: I0320 10:57:01.887111 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:01 crc kubenswrapper[4695]: E0320 10:57:01.887376 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:01 crc kubenswrapper[4695]: I0320 10:57:01.887080 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:01 crc kubenswrapper[4695]: E0320 10:57:01.887441 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:01 crc kubenswrapper[4695]: I0320 10:57:01.945534 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:01 crc kubenswrapper[4695]: E0320 10:57:01.945790 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 10:59:03.945742799 +0000 UTC m=+321.726348372 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:01 crc kubenswrapper[4695]: I0320 10:57:01.945868 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:01 crc kubenswrapper[4695]: I0320 10:57:01.945952 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:01 crc kubenswrapper[4695]: I0320 10:57:01.945995 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:01 crc kubenswrapper[4695]: I0320 10:57:01.946014 4695 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:01 crc kubenswrapper[4695]: E0320 10:57:01.946060 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:57:01 crc kubenswrapper[4695]: E0320 10:57:01.946083 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:57:01 crc kubenswrapper[4695]: E0320 10:57:01.946099 4695 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:57:01 crc kubenswrapper[4695]: E0320 10:57:01.946138 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:57:01 crc kubenswrapper[4695]: E0320 10:57:01.946153 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:57:01 crc kubenswrapper[4695]: E0320 10:57:01.946163 4695 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:57:01 crc kubenswrapper[4695]: E0320 10:57:01.946152 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:59:03.94614363 +0000 UTC m=+321.726749193 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:57:01 crc kubenswrapper[4695]: E0320 10:57:01.946199 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:59:03.946191361 +0000 UTC m=+321.726796924 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:57:01 crc kubenswrapper[4695]: E0320 10:57:01.946215 4695 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:57:01 crc kubenswrapper[4695]: E0320 10:57:01.946236 4695 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:57:01 crc kubenswrapper[4695]: E0320 10:57:01.946359 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:59:03.946333635 +0000 UTC m=+321.726939198 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:57:01 crc kubenswrapper[4695]: E0320 10:57:01.946480 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:59:03.946468808 +0000 UTC m=+321.727074371 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:57:02 crc kubenswrapper[4695]: I0320 10:57:02.886443 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:57:02 crc kubenswrapper[4695]: E0320 10:57:02.887874 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h5s76" podUID="e0468323-460e-4bf3-be74-9c2330bde834" Mar 20 10:57:03 crc kubenswrapper[4695]: I0320 10:57:03.886087 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:03 crc kubenswrapper[4695]: I0320 10:57:03.886195 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:03 crc kubenswrapper[4695]: I0320 10:57:03.886289 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:03 crc kubenswrapper[4695]: I0320 10:57:03.889645 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 10:57:03 crc kubenswrapper[4695]: I0320 10:57:03.889883 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 10:57:03 crc kubenswrapper[4695]: I0320 10:57:03.890704 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 10:57:03 crc kubenswrapper[4695]: I0320 10:57:03.890947 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 10:57:04 crc kubenswrapper[4695]: I0320 10:57:04.886815 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:57:04 crc kubenswrapper[4695]: I0320 10:57:04.889897 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 10:57:04 crc kubenswrapper[4695]: I0320 10:57:04.890233 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.871429 4695 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.914345 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs"] Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.914794 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.917468 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-zlr2v"] Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.917989 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zlr2v" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.926797 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.928452 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.928688 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.929006 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.929153 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.945014 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.945289 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8f97r"] Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.945876 4695 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-authentication-operator/authentication-operator-69f744f599-rgpbm"] Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.946172 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r82b4"] Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.945873 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.946493 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.945945 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.946787 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8f97r" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.946988 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-ns9rz"] Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.946798 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-rgpbm" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.948217 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ns9rz" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.948284 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.948543 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.949025 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.949516 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.949650 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2dmjw"] Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.950344 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2dmjw" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.950759 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb"] Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.955481 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-dzwsk"] Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.956067 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-dzwsk" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.956636 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.958275 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ghv5z"] Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.958799 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s62xw"] Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.959221 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s62xw" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.959679 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dqr46"] Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.960377 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-dqr46" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.963125 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ghv5z" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.964300 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-z8vn7"] Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.965341 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.966872 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-x95pq"] Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.967317 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-92bhx"] Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.967747 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zhb2h"] Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.968356 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-x95pq" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.969021 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-92bhx" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.969253 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zhb2h" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.970572 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.970990 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.973869 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.974394 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.975068 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-s2xcj"] Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.975897 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-s2xcj"
Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.977631 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.978183 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.978390 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.978510 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.978542 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.978605 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.978949 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.981000 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqhhq"]
Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.981661 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xz9q6"]
Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.981787 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.982094 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xz9q6"
Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.982261 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.982441 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqhhq"
Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.982989 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 20 10:57:06 crc kubenswrapper[4695]: I0320 10:57:06.987943 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8f97r"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.005960 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.008763 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.009643 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.010120 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.010964 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.020091 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.020718 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.046125 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.046250 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.046364 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.046481 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.046610 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.046739 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.046143 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.047039 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.047308 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.047331 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.047536 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.047606 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.047643 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.047765 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.047822 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.048758 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.048955 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.049057 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.049152 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.049581 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.049681 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.049822 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.049833 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.049999 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.050105 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.050234 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.050293 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.050366 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.050412 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.050422 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.050483 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.050247 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.050550 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.050609 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.050656 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.050702 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.050723 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.050751 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.050760 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.050765 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.050845 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.050871 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.050969 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.050992 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.050709 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.051096 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.051155 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.051183 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.051099 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.051103 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.051260 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.051216 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.051315 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.051450 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.051480 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.051566 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.051594 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.051652 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.051718 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.052219 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.052301 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.052544 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.057548 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.058831 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-szd4f"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.059687 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vnnmf"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.060092 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-szd4f"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.060154 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vnnmf"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.066963 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.067347 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.067489 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.067702 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.068016 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m9rp"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.069177 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.069527 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.074974 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.075343 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.076816 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.078080 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.079142 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-gnknn"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.079320 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m9rp"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.079842 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-kz6kx"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.080237 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-gnknn"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.080342 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lkn7q"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.080608 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kz6kx"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.080701 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hdzsw"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.080941 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lkn7q"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.081371 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hdzsw"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.083660 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-h8rbk"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.087571 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.093211 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7f96s"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.097806 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9z5kl"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.098082 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.098506 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mmw7q"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.098590 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7f96s"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.099881 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mmw7q"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.100382 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9z5kl"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.101998 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566725-qt6v2"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.103360 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-qt6v2"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.104779 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xc2wd"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.106073 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xc2wd"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.107140 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-9mvwv"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.108668 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-9mvwv"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.108707 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.108964 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-w7b9p"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.108779 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.109243 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/874d0ff7-4923-4423-920a-59e6a632507a-service-ca\") pod \"console-f9d7485db-s2xcj\" (UID: \"874d0ff7-4923-4423-920a-59e6a632507a\") " pod="openshift-console/console-f9d7485db-s2xcj"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.109291 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c5f41a94-dc5d-4026-983e-52e817217252-audit-dir\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.109317 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4290a5ac-ad3d-4b7c-aa6a-8d714de728de-trusted-ca\") pod \"console-operator-58897d9998-x95pq\" (UID: \"4290a5ac-ad3d-4b7c-aa6a-8d714de728de\") " pod="openshift-console-operator/console-operator-58897d9998-x95pq"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.109341 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32c4a374-18fd-4aac-807a-f191398b9490-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-s62xw\" (UID: \"32c4a374-18fd-4aac-807a-f191398b9490\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s62xw"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.109397 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214a7b7b-3483-4084-bf89-fffbe6e5d591-config\") pod \"etcd-operator-b45778765-dqr46\" (UID: \"214a7b7b-3483-4084-bf89-fffbe6e5d591\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dqr46"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.109484 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80f6a0e3-51c3-408f-93ca-c9bf5ed57f34-config\") pod \"authentication-operator-69f744f599-rgpbm\" (UID: \"80f6a0e3-51c3-408f-93ca-c9bf5ed57f34\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rgpbm"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.109519 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-w7b9p"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.109539 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/874d0ff7-4923-4423-920a-59e6a632507a-console-oauth-config\") pod \"console-f9d7485db-s2xcj\" (UID: \"874d0ff7-4923-4423-920a-59e6a632507a\") " pod="openshift-console/console-f9d7485db-s2xcj"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.109966 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/42a13b86-8970-4571-9a14-9dea1c55558f-etcd-serving-ca\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.109998 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/42a13b86-8970-4571-9a14-9dea1c55558f-encryption-config\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.110016 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/874d0ff7-4923-4423-920a-59e6a632507a-console-config\") pod \"console-f9d7485db-s2xcj\" (UID: \"874d0ff7-4923-4423-920a-59e6a632507a\") " pod="openshift-console/console-f9d7485db-s2xcj"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.110042 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/214a7b7b-3483-4084-bf89-fffbe6e5d591-etcd-client\") pod \"etcd-operator-b45778765-dqr46\" (UID: \"214a7b7b-3483-4084-bf89-fffbe6e5d591\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dqr46"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.110074 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8f97r\" (UID: \"0cb76749-e1ee-42c1-bcef-cb2ca1d793d2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8f97r"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.110137 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c96kl\" (UniqueName: \"kubernetes.io/projected/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2-kube-api-access-c96kl\") pod \"controller-manager-879f6c89f-8f97r\" (UID: \"0cb76749-e1ee-42c1-bcef-cb2ca1d793d2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8f97r"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.110160 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/31256627-4fcd-49a5-87b1-0e52e6265720-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zhb2h\" (UID: \"31256627-4fcd-49a5-87b1-0e52e6265720\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zhb2h"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.110221 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw6g9\" (UniqueName: \"kubernetes.io/projected/acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564-kube-api-access-vw6g9\") pod \"downloads-7954f5f757-dzwsk\" (UID: \"acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564\") " pod="openshift-console/downloads-7954f5f757-dzwsk"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.110393 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cd0f715-9ef8-4e6c-8e56-8e17bd66d882-config\") pod \"machine-api-operator-5694c8668f-92bhx\" (UID: \"0cd0f715-9ef8-4e6c-8e56-8e17bd66d882\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-92bhx"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.110473 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80f6a0e3-51c3-408f-93ca-c9bf5ed57f34-service-ca-bundle\") pod \"authentication-operator-69f744f599-rgpbm\" (UID: \"80f6a0e3-51c3-408f-93ca-c9bf5ed57f34\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rgpbm"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.110508 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/42a13b86-8970-4571-9a14-9dea1c55558f-audit\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.110573 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c281c741-25bd-4650-a6aa-6e33eaf3d80c-auth-proxy-config\") pod \"machine-approver-56656f9798-zlr2v\" (UID: \"c281c741-25bd-4650-a6aa-6e33eaf3d80c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zlr2v"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.110606 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7q7t\" (UniqueName: \"kubernetes.io/projected/31256627-4fcd-49a5-87b1-0e52e6265720-kube-api-access-s7q7t\") pod \"cluster-image-registry-operator-dc59b4c8b-zhb2h\" (UID: \"31256627-4fcd-49a5-87b1-0e52e6265720\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zhb2h"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.110638 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/649c40d8-a63f-4361-89b5-15e4ccc2e4dc-serving-cert\") pod \"openshift-config-operator-7777fb866f-ns9rz\" (UID: \"649c40d8-a63f-4361-89b5-15e4ccc2e4dc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ns9rz"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.110663 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f524x\" (UniqueName: \"kubernetes.io/projected/2a2b9241-d006-442c-9133-74ac83793483-kube-api-access-f524x\") pod \"cluster-samples-operator-665b6dd947-2dmjw\" (UID: \"2a2b9241-d006-442c-9133-74ac83793483\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2dmjw"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.110697 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.110724 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c569b375-f808-4f3d-8f3d-a162677356ff-config\") pod \"route-controller-manager-6576b87f9c-8ttcs\" (UID: \"c569b375-f808-4f3d-8f3d-a162677356ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.110805 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/214a7b7b-3483-4084-bf89-fffbe6e5d591-etcd-ca\") pod \"etcd-operator-b45778765-dqr46\" (UID: \"214a7b7b-3483-4084-bf89-fffbe6e5d591\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dqr46"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.111181 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f1b00244-2d56-4cfc-a852-0ef9c33214e1-encryption-config\") pod \"apiserver-7bbb656c7d-fc6tb\" (UID: \"f1b00244-2d56-4cfc-a852-0ef9c33214e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.111225 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32c4a374-18fd-4aac-807a-f191398b9490-config\") pod \"openshift-apiserver-operator-796bbdcf4f-s62xw\" (UID: \"32c4a374-18fd-4aac-807a-f191398b9490\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s62xw"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.111252 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-ns9rz"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.111256 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf246004-3476-4ae3-8a8b-25f2e6d44ec9-config\") pod \"kube-apiserver-operator-766d6c64bb-xz9q6\" (UID: \"cf246004-3476-4ae3-8a8b-25f2e6d44ec9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xz9q6"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.111339 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/379303a6-d86c-4861-b7d5-b46f2a336fb9-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wqhhq\" (UID: \"379303a6-d86c-4861-b7d5-b46f2a336fb9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqhhq"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.111364 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/42a13b86-8970-4571-9a14-9dea1c55558f-node-pullsecrets\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.111395 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80f6a0e3-51c3-408f-93ca-c9bf5ed57f34-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-rgpbm\" (UID: \"80f6a0e3-51c3-408f-93ca-c9bf5ed57f34\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rgpbm"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.111414 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc74s\" (UniqueName: \"kubernetes.io/projected/80f6a0e3-51c3-408f-93ca-c9bf5ed57f34-kube-api-access-cc74s\") pod \"authentication-operator-69f744f599-rgpbm\" (UID: \"80f6a0e3-51c3-408f-93ca-c9bf5ed57f34\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rgpbm"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.111435 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4290a5ac-ad3d-4b7c-aa6a-8d714de728de-config\") pod \"console-operator-58897d9998-x95pq\" (UID: \"4290a5ac-ad3d-4b7c-aa6a-8d714de728de\") " pod="openshift-console-operator/console-operator-58897d9998-x95pq"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.111457 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80f6a0e3-51c3-408f-93ca-c9bf5ed57f34-serving-cert\") pod \"authentication-operator-69f744f599-rgpbm\" (UID: \"80f6a0e3-51c3-408f-93ca-c9bf5ed57f34\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rgpbm"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.111683 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.111745 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4qnv\" (UniqueName:
\"kubernetes.io/projected/32c4a374-18fd-4aac-807a-f191398b9490-kube-api-access-f4qnv\") pod \"openshift-apiserver-operator-796bbdcf4f-s62xw\" (UID: \"32c4a374-18fd-4aac-807a-f191398b9490\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s62xw" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.111801 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p96p\" (UniqueName: \"kubernetes.io/projected/4290a5ac-ad3d-4b7c-aa6a-8d714de728de-kube-api-access-7p96p\") pod \"console-operator-58897d9998-x95pq\" (UID: \"4290a5ac-ad3d-4b7c-aa6a-8d714de728de\") " pod="openshift-console-operator/console-operator-58897d9998-x95pq" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.111854 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/42a13b86-8970-4571-9a14-9dea1c55558f-etcd-client\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112084 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112148 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksggv\" (UniqueName: \"kubernetes.io/projected/214a7b7b-3483-4084-bf89-fffbe6e5d591-kube-api-access-ksggv\") pod \"etcd-operator-b45778765-dqr46\" (UID: \"214a7b7b-3483-4084-bf89-fffbe6e5d591\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-dqr46" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112199 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42a13b86-8970-4571-9a14-9dea1c55558f-serving-cert\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112258 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112285 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjh6w\" (UniqueName: \"kubernetes.io/projected/0cd0f715-9ef8-4e6c-8e56-8e17bd66d882-kube-api-access-rjh6w\") pod \"machine-api-operator-5694c8668f-92bhx\" (UID: \"0cd0f715-9ef8-4e6c-8e56-8e17bd66d882\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-92bhx" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112310 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42a13b86-8970-4571-9a14-9dea1c55558f-config\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112344 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4krs7\" (UniqueName: 
\"kubernetes.io/projected/c569b375-f808-4f3d-8f3d-a162677356ff-kube-api-access-4krs7\") pod \"route-controller-manager-6576b87f9c-8ttcs\" (UID: \"c569b375-f808-4f3d-8f3d-a162677356ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112367 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0cd0f715-9ef8-4e6c-8e56-8e17bd66d882-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-92bhx\" (UID: \"0cd0f715-9ef8-4e6c-8e56-8e17bd66d882\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-92bhx" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112387 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/42a13b86-8970-4571-9a14-9dea1c55558f-audit-dir\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112409 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112434 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1b00244-2d56-4cfc-a852-0ef9c33214e1-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fc6tb\" (UID: \"f1b00244-2d56-4cfc-a852-0ef9c33214e1\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112455 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdx85\" (UniqueName: \"kubernetes.io/projected/c5f41a94-dc5d-4026-983e-52e817217252-kube-api-access-cdx85\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112476 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rlqn\" (UniqueName: \"kubernetes.io/projected/f1b00244-2d56-4cfc-a852-0ef9c33214e1-kube-api-access-7rlqn\") pod \"apiserver-7bbb656c7d-fc6tb\" (UID: \"f1b00244-2d56-4cfc-a852-0ef9c33214e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112505 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5phk7\" (UniqueName: \"kubernetes.io/projected/42a13b86-8970-4571-9a14-9dea1c55558f-kube-api-access-5phk7\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112500 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zwjht"] Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112529 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112550 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c569b375-f808-4f3d-8f3d-a162677356ff-client-ca\") pod \"route-controller-manager-6576b87f9c-8ttcs\" (UID: \"c569b375-f808-4f3d-8f3d-a162677356ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112578 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f1b00244-2d56-4cfc-a852-0ef9c33214e1-audit-policies\") pod \"apiserver-7bbb656c7d-fc6tb\" (UID: \"f1b00244-2d56-4cfc-a852-0ef9c33214e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112598 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/874d0ff7-4923-4423-920a-59e6a632507a-trusted-ca-bundle\") pod \"console-f9d7485db-s2xcj\" (UID: \"874d0ff7-4923-4423-920a-59e6a632507a\") " pod="openshift-console/console-f9d7485db-s2xcj" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112631 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/214a7b7b-3483-4084-bf89-fffbe6e5d591-etcd-service-ca\") pod \"etcd-operator-b45778765-dqr46\" (UID: \"214a7b7b-3483-4084-bf89-fffbe6e5d591\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dqr46" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112651 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc4bv\" (UniqueName: 
\"kubernetes.io/projected/379303a6-d86c-4861-b7d5-b46f2a336fb9-kube-api-access-nc4bv\") pod \"openshift-controller-manager-operator-756b6f6bc6-wqhhq\" (UID: \"379303a6-d86c-4861-b7d5-b46f2a336fb9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqhhq" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112691 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/42a13b86-8970-4571-9a14-9dea1c55558f-image-import-ca\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112731 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpjln\" (UniqueName: \"kubernetes.io/projected/874d0ff7-4923-4423-920a-59e6a632507a-kube-api-access-rpjln\") pod \"console-f9d7485db-s2xcj\" (UID: \"874d0ff7-4923-4423-920a-59e6a632507a\") " pod="openshift-console/console-f9d7485db-s2xcj" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112769 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2-config\") pod \"controller-manager-879f6c89f-8f97r\" (UID: \"0cb76749-e1ee-42c1-bcef-cb2ca1d793d2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8f97r" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112792 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf246004-3476-4ae3-8a8b-25f2e6d44ec9-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-xz9q6\" (UID: \"cf246004-3476-4ae3-8a8b-25f2e6d44ec9\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xz9q6" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112811 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/379303a6-d86c-4861-b7d5-b46f2a336fb9-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wqhhq\" (UID: \"379303a6-d86c-4861-b7d5-b46f2a336fb9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqhhq" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112837 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42a13b86-8970-4571-9a14-9dea1c55558f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112867 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112896 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112937 4695 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f1b00244-2d56-4cfc-a852-0ef9c33214e1-etcd-client\") pod \"apiserver-7bbb656c7d-fc6tb\" (UID: \"f1b00244-2d56-4cfc-a852-0ef9c33214e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112960 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c281c741-25bd-4650-a6aa-6e33eaf3d80c-config\") pod \"machine-approver-56656f9798-zlr2v\" (UID: \"c281c741-25bd-4650-a6aa-6e33eaf3d80c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zlr2v" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.112982 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2a2b9241-d006-442c-9133-74ac83793483-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-2dmjw\" (UID: \"2a2b9241-d006-442c-9133-74ac83793483\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2dmjw" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.113020 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1b00244-2d56-4cfc-a852-0ef9c33214e1-serving-cert\") pod \"apiserver-7bbb656c7d-fc6tb\" (UID: \"f1b00244-2d56-4cfc-a852-0ef9c33214e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.113050 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5d71b917-7595-49dd-8c41-e7e642210d35-metrics-tls\") pod \"dns-operator-744455d44c-ghv5z\" (UID: \"5d71b917-7595-49dd-8c41-e7e642210d35\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-ghv5z" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.113149 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.113170 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/214a7b7b-3483-4084-bf89-fffbe6e5d591-serving-cert\") pod \"etcd-operator-b45778765-dqr46\" (UID: \"214a7b7b-3483-4084-bf89-fffbe6e5d591\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dqr46" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.113190 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2-serving-cert\") pod \"controller-manager-879f6c89f-8f97r\" (UID: \"0cb76749-e1ee-42c1-bcef-cb2ca1d793d2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8f97r" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.113264 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4290a5ac-ad3d-4b7c-aa6a-8d714de728de-serving-cert\") pod \"console-operator-58897d9998-x95pq\" (UID: \"4290a5ac-ad3d-4b7c-aa6a-8d714de728de\") " pod="openshift-console-operator/console-operator-58897d9998-x95pq" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.113291 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlvpw\" (UniqueName: 
\"kubernetes.io/projected/5d71b917-7595-49dd-8c41-e7e642210d35-kube-api-access-qlvpw\") pod \"dns-operator-744455d44c-ghv5z\" (UID: \"5d71b917-7595-49dd-8c41-e7e642210d35\") " pod="openshift-dns-operator/dns-operator-744455d44c-ghv5z" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.113323 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/31256627-4fcd-49a5-87b1-0e52e6265720-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-zhb2h\" (UID: \"31256627-4fcd-49a5-87b1-0e52e6265720\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zhb2h" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.113353 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f1b00244-2d56-4cfc-a852-0ef9c33214e1-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fc6tb\" (UID: \"f1b00244-2d56-4cfc-a852-0ef9c33214e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.113379 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pslv7\" (UniqueName: \"kubernetes.io/projected/c281c741-25bd-4650-a6aa-6e33eaf3d80c-kube-api-access-pslv7\") pod \"machine-approver-56656f9798-zlr2v\" (UID: \"c281c741-25bd-4650-a6aa-6e33eaf3d80c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zlr2v" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.113423 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf246004-3476-4ae3-8a8b-25f2e6d44ec9-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-xz9q6\" (UID: \"cf246004-3476-4ae3-8a8b-25f2e6d44ec9\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xz9q6" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.113458 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/649c40d8-a63f-4361-89b5-15e4ccc2e4dc-available-featuregates\") pod \"openshift-config-operator-7777fb866f-ns9rz\" (UID: \"649c40d8-a63f-4361-89b5-15e4ccc2e4dc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ns9rz" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.113484 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c5f41a94-dc5d-4026-983e-52e817217252-audit-policies\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.113529 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/874d0ff7-4923-4423-920a-59e6a632507a-console-serving-cert\") pod \"console-f9d7485db-s2xcj\" (UID: \"874d0ff7-4923-4423-920a-59e6a632507a\") " pod="openshift-console/console-f9d7485db-s2xcj" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.113567 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/31256627-4fcd-49a5-87b1-0e52e6265720-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-zhb2h\" (UID: \"31256627-4fcd-49a5-87b1-0e52e6265720\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zhb2h" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.113593 4695 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv2gf\" (UniqueName: \"kubernetes.io/projected/649c40d8-a63f-4361-89b5-15e4ccc2e4dc-kube-api-access-bv2gf\") pod \"openshift-config-operator-7777fb866f-ns9rz\" (UID: \"649c40d8-a63f-4361-89b5-15e4ccc2e4dc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ns9rz" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.113597 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zwjht" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.113610 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/874d0ff7-4923-4423-920a-59e6a632507a-oauth-serving-cert\") pod \"console-f9d7485db-s2xcj\" (UID: \"874d0ff7-4923-4423-920a-59e6a632507a\") " pod="openshift-console/console-f9d7485db-s2xcj" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.113640 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0cd0f715-9ef8-4e6c-8e56-8e17bd66d882-images\") pod \"machine-api-operator-5694c8668f-92bhx\" (UID: \"0cd0f715-9ef8-4e6c-8e56-8e17bd66d882\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-92bhx" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.113662 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2-client-ca\") pod \"controller-manager-879f6c89f-8f97r\" (UID: \"0cb76749-e1ee-42c1-bcef-cb2ca1d793d2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8f97r" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.113688 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c569b375-f808-4f3d-8f3d-a162677356ff-serving-cert\") pod \"route-controller-manager-6576b87f9c-8ttcs\" (UID: \"c569b375-f808-4f3d-8f3d-a162677356ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.113707 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f1b00244-2d56-4cfc-a852-0ef9c33214e1-audit-dir\") pod \"apiserver-7bbb656c7d-fc6tb\" (UID: \"f1b00244-2d56-4cfc-a852-0ef9c33214e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.113724 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c281c741-25bd-4650-a6aa-6e33eaf3d80c-machine-approver-tls\") pod \"machine-approver-56656f9798-zlr2v\" (UID: \"c281c741-25bd-4650-a6aa-6e33eaf3d80c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zlr2v" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.113891 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-djlcz"] Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.114567 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-djlcz" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.115162 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qjmq4"] Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.116150 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qjmq4" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.117781 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.117858 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566736-t45k7"] Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.118382 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566736-t45k7" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.126106 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qd69b"] Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.128715 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-96sfc"] Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.129318 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qd69b" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.129713 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-96sfc"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.130838 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-rgpbm"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.131438 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.137317 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-dzwsk"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.139351 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.140211 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-92bhx"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.149129 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ghv5z"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.149205 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xz9q6"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.149217 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r82b4"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.159306 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.167492 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2dmjw"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.167555 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqhhq"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.167578 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-kz6kx"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.171534 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-z8vn7"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.184736 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hdzsw"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.188296 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.189999 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vnnmf"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.192802 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s62xw"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.198377 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566725-qt6v2"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.201341 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.204656 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-9mvwv"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.211980 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-s2xcj"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.212040 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.212058 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-h8rbk"]
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.214558 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f1b00244-2d56-4cfc-a852-0ef9c33214e1-etcd-client\") pod \"apiserver-7bbb656c7d-fc6tb\" (UID: \"f1b00244-2d56-4cfc-a852-0ef9c33214e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.214618 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c281c741-25bd-4650-a6aa-6e33eaf3d80c-config\") pod \"machine-approver-56656f9798-zlr2v\" (UID: \"c281c741-25bd-4650-a6aa-6e33eaf3d80c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zlr2v"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.214651 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff9715a6-753c-4341-899a-e769836ad4e1-config\") pod \"service-ca-operator-777779d784-96sfc\" (UID: \"ff9715a6-753c-4341-899a-e769836ad4e1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-96sfc"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.214673 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnvjd\" (UniqueName: \"kubernetes.io/projected/ff9715a6-753c-4341-899a-e769836ad4e1-kube-api-access-xnvjd\") pod \"service-ca-operator-777779d784-96sfc\" (UID: \"ff9715a6-753c-4341-899a-e769836ad4e1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-96sfc"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.214694 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2a2b9241-d006-442c-9133-74ac83793483-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-2dmjw\" (UID: \"2a2b9241-d006-442c-9133-74ac83793483\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2dmjw"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.214717 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2eafc357-18f4-49a8-88be-d7e67ed800a0-secret-volume\") pod \"collect-profiles-29566725-qt6v2\" (UID: \"2eafc357-18f4-49a8-88be-d7e67ed800a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-qt6v2"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.214737 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/288d9110-491b-424c-b001-47835e23220f-signing-key\") pod \"service-ca-9c57cc56f-djlcz\" (UID: \"288d9110-491b-424c-b001-47835e23220f\") " pod="openshift-service-ca/service-ca-9c57cc56f-djlcz"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.214756 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1b00244-2d56-4cfc-a852-0ef9c33214e1-serving-cert\") pod \"apiserver-7bbb656c7d-fc6tb\" (UID: \"f1b00244-2d56-4cfc-a852-0ef9c33214e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.214775 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5d71b917-7595-49dd-8c41-e7e642210d35-metrics-tls\") pod \"dns-operator-744455d44c-ghv5z\" (UID: \"5d71b917-7595-49dd-8c41-e7e642210d35\") " pod="openshift-dns-operator/dns-operator-744455d44c-ghv5z"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.214798 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.214818 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/214a7b7b-3483-4084-bf89-fffbe6e5d591-serving-cert\") pod \"etcd-operator-b45778765-dqr46\" (UID: \"214a7b7b-3483-4084-bf89-fffbe6e5d591\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dqr46"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.214839 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2-serving-cert\") pod \"controller-manager-879f6c89f-8f97r\" (UID: \"0cb76749-e1ee-42c1-bcef-cb2ca1d793d2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8f97r"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.214861 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4290a5ac-ad3d-4b7c-aa6a-8d714de728de-serving-cert\") pod \"console-operator-58897d9998-x95pq\" (UID: \"4290a5ac-ad3d-4b7c-aa6a-8d714de728de\") " pod="openshift-console-operator/console-operator-58897d9998-x95pq"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.214888 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlvpw\" (UniqueName: \"kubernetes.io/projected/5d71b917-7595-49dd-8c41-e7e642210d35-kube-api-access-qlvpw\") pod \"dns-operator-744455d44c-ghv5z\" (UID: \"5d71b917-7595-49dd-8c41-e7e642210d35\") " pod="openshift-dns-operator/dns-operator-744455d44c-ghv5z"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.214935 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/31256627-4fcd-49a5-87b1-0e52e6265720-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-zhb2h\" (UID: \"31256627-4fcd-49a5-87b1-0e52e6265720\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zhb2h"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.214963 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f1b00244-2d56-4cfc-a852-0ef9c33214e1-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fc6tb\" (UID: \"f1b00244-2d56-4cfc-a852-0ef9c33214e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.214985 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pslv7\" (UniqueName: \"kubernetes.io/projected/c281c741-25bd-4650-a6aa-6e33eaf3d80c-kube-api-access-pslv7\") pod \"machine-approver-56656f9798-zlr2v\" (UID: \"c281c741-25bd-4650-a6aa-6e33eaf3d80c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zlr2v"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215008 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf246004-3476-4ae3-8a8b-25f2e6d44ec9-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-xz9q6\" (UID: \"cf246004-3476-4ae3-8a8b-25f2e6d44ec9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xz9q6"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215031 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/649c40d8-a63f-4361-89b5-15e4ccc2e4dc-available-featuregates\") pod \"openshift-config-operator-7777fb866f-ns9rz\" (UID: \"649c40d8-a63f-4361-89b5-15e4ccc2e4dc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ns9rz"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215060 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c5f41a94-dc5d-4026-983e-52e817217252-audit-policies\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215083 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2eafc357-18f4-49a8-88be-d7e67ed800a0-config-volume\") pod \"collect-profiles-29566725-qt6v2\" (UID: \"2eafc357-18f4-49a8-88be-d7e67ed800a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-qt6v2"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215108 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/874d0ff7-4923-4423-920a-59e6a632507a-console-serving-cert\") pod \"console-f9d7485db-s2xcj\" (UID: \"874d0ff7-4923-4423-920a-59e6a632507a\") " pod="openshift-console/console-f9d7485db-s2xcj"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215136 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/31256627-4fcd-49a5-87b1-0e52e6265720-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-zhb2h\" (UID: \"31256627-4fcd-49a5-87b1-0e52e6265720\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zhb2h"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215160 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv2gf\" (UniqueName: \"kubernetes.io/projected/649c40d8-a63f-4361-89b5-15e4ccc2e4dc-kube-api-access-bv2gf\") pod \"openshift-config-operator-7777fb866f-ns9rz\" (UID: \"649c40d8-a63f-4361-89b5-15e4ccc2e4dc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ns9rz"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215186 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/874d0ff7-4923-4423-920a-59e6a632507a-oauth-serving-cert\") pod \"console-f9d7485db-s2xcj\" (UID: \"874d0ff7-4923-4423-920a-59e6a632507a\") " pod="openshift-console/console-f9d7485db-s2xcj"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215209 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0cd0f715-9ef8-4e6c-8e56-8e17bd66d882-images\") pod \"machine-api-operator-5694c8668f-92bhx\" (UID: \"0cd0f715-9ef8-4e6c-8e56-8e17bd66d882\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-92bhx"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215232 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2-client-ca\") pod \"controller-manager-879f6c89f-8f97r\" (UID: \"0cb76749-e1ee-42c1-bcef-cb2ca1d793d2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8f97r"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215254 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c569b375-f808-4f3d-8f3d-a162677356ff-serving-cert\") pod \"route-controller-manager-6576b87f9c-8ttcs\" (UID: \"c569b375-f808-4f3d-8f3d-a162677356ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215275 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f1b00244-2d56-4cfc-a852-0ef9c33214e1-audit-dir\") pod \"apiserver-7bbb656c7d-fc6tb\" (UID: \"f1b00244-2d56-4cfc-a852-0ef9c33214e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215298 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c281c741-25bd-4650-a6aa-6e33eaf3d80c-machine-approver-tls\") pod \"machine-approver-56656f9798-zlr2v\" (UID: \"c281c741-25bd-4650-a6aa-6e33eaf3d80c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zlr2v"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215322 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/178400b6-5c7d-4e98-9884-ac349ecc48e8-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vnnmf\" (UID: \"178400b6-5c7d-4e98-9884-ac349ecc48e8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vnnmf"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215350 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215374 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215392 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/874d0ff7-4923-4423-920a-59e6a632507a-service-ca\") pod \"console-f9d7485db-s2xcj\" (UID: \"874d0ff7-4923-4423-920a-59e6a632507a\") " pod="openshift-console/console-f9d7485db-s2xcj"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215413 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c5f41a94-dc5d-4026-983e-52e817217252-audit-dir\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215433 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4290a5ac-ad3d-4b7c-aa6a-8d714de728de-trusted-ca\") pod \"console-operator-58897d9998-x95pq\" (UID: \"4290a5ac-ad3d-4b7c-aa6a-8d714de728de\") " pod="openshift-console-operator/console-operator-58897d9998-x95pq"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215455 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32c4a374-18fd-4aac-807a-f191398b9490-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-s62xw\" (UID: \"32c4a374-18fd-4aac-807a-f191398b9490\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s62xw"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215481 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4pg7\" (UniqueName: \"kubernetes.io/projected/288d9110-491b-424c-b001-47835e23220f-kube-api-access-x4pg7\") pod \"service-ca-9c57cc56f-djlcz\" (UID: \"288d9110-491b-424c-b001-47835e23220f\") " pod="openshift-service-ca/service-ca-9c57cc56f-djlcz"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215516 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/214a7b7b-3483-4084-bf89-fffbe6e5d591-config\") pod \"etcd-operator-b45778765-dqr46\" (UID: \"214a7b7b-3483-4084-bf89-fffbe6e5d591\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dqr46"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215538 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80f6a0e3-51c3-408f-93ca-c9bf5ed57f34-config\") pod \"authentication-operator-69f744f599-rgpbm\" (UID: \"80f6a0e3-51c3-408f-93ca-c9bf5ed57f34\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rgpbm"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215557 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/874d0ff7-4923-4423-920a-59e6a632507a-console-oauth-config\") pod \"console-f9d7485db-s2xcj\" (UID: \"874d0ff7-4923-4423-920a-59e6a632507a\") " pod="openshift-console/console-f9d7485db-s2xcj"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215582 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/42a13b86-8970-4571-9a14-9dea1c55558f-etcd-serving-ca\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215604 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/42a13b86-8970-4571-9a14-9dea1c55558f-encryption-config\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215626 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/874d0ff7-4923-4423-920a-59e6a632507a-console-config\") pod \"console-f9d7485db-s2xcj\" (UID: \"874d0ff7-4923-4423-920a-59e6a632507a\") " pod="openshift-console/console-f9d7485db-s2xcj"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215652 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/214a7b7b-3483-4084-bf89-fffbe6e5d591-etcd-client\") pod \"etcd-operator-b45778765-dqr46\" (UID: \"214a7b7b-3483-4084-bf89-fffbe6e5d591\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dqr46"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215674 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8f97r\" (UID: \"0cb76749-e1ee-42c1-bcef-cb2ca1d793d2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8f97r"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215701 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c96kl\" (UniqueName: \"kubernetes.io/projected/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2-kube-api-access-c96kl\") pod \"controller-manager-879f6c89f-8f97r\" (UID: \"0cb76749-e1ee-42c1-bcef-cb2ca1d793d2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8f97r"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215722 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/31256627-4fcd-49a5-87b1-0e52e6265720-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zhb2h\" (UID: \"31256627-4fcd-49a5-87b1-0e52e6265720\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zhb2h"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215753 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw6g9\" (UniqueName: \"kubernetes.io/projected/acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564-kube-api-access-vw6g9\") pod \"downloads-7954f5f757-dzwsk\" (UID: \"acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564\") " pod="openshift-console/downloads-7954f5f757-dzwsk"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215777 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cd0f715-9ef8-4e6c-8e56-8e17bd66d882-config\") pod \"machine-api-operator-5694c8668f-92bhx\" (UID: \"0cd0f715-9ef8-4e6c-8e56-8e17bd66d882\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-92bhx"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215801 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80f6a0e3-51c3-408f-93ca-c9bf5ed57f34-service-ca-bundle\") pod \"authentication-operator-69f744f599-rgpbm\" (UID: \"80f6a0e3-51c3-408f-93ca-c9bf5ed57f34\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rgpbm"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215822 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/42a13b86-8970-4571-9a14-9dea1c55558f-audit\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215859 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c281c741-25bd-4650-a6aa-6e33eaf3d80c-auth-proxy-config\") pod \"machine-approver-56656f9798-zlr2v\" (UID: \"c281c741-25bd-4650-a6aa-6e33eaf3d80c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zlr2v"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215880 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7q7t\" (UniqueName: \"kubernetes.io/projected/31256627-4fcd-49a5-87b1-0e52e6265720-kube-api-access-s7q7t\") pod \"cluster-image-registry-operator-dc59b4c8b-zhb2h\" (UID: \"31256627-4fcd-49a5-87b1-0e52e6265720\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zhb2h"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215902 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/649c40d8-a63f-4361-89b5-15e4ccc2e4dc-serving-cert\") pod \"openshift-config-operator-7777fb866f-ns9rz\" (UID: \"649c40d8-a63f-4361-89b5-15e4ccc2e4dc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ns9rz"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215946 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f524x\" (UniqueName: \"kubernetes.io/projected/2a2b9241-d006-442c-9133-74ac83793483-kube-api-access-f524x\") pod \"cluster-samples-operator-665b6dd947-2dmjw\" (UID: \"2a2b9241-d006-442c-9133-74ac83793483\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2dmjw"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215974 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.215999 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c569b375-f808-4f3d-8f3d-a162677356ff-config\") pod \"route-controller-manager-6576b87f9c-8ttcs\" (UID: \"c569b375-f808-4f3d-8f3d-a162677356ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216025 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/214a7b7b-3483-4084-bf89-fffbe6e5d591-etcd-ca\") pod \"etcd-operator-b45778765-dqr46\" (UID: \"214a7b7b-3483-4084-bf89-fffbe6e5d591\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dqr46"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216059 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f1b00244-2d56-4cfc-a852-0ef9c33214e1-encryption-config\") pod \"apiserver-7bbb656c7d-fc6tb\" (UID: \"f1b00244-2d56-4cfc-a852-0ef9c33214e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216084 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32c4a374-18fd-4aac-807a-f191398b9490-config\") pod \"openshift-apiserver-operator-796bbdcf4f-s62xw\" (UID: \"32c4a374-18fd-4aac-807a-f191398b9490\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s62xw"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216106 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf246004-3476-4ae3-8a8b-25f2e6d44ec9-config\") pod \"kube-apiserver-operator-766d6c64bb-xz9q6\" (UID: \"cf246004-3476-4ae3-8a8b-25f2e6d44ec9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xz9q6"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216130 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/379303a6-d86c-4861-b7d5-b46f2a336fb9-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wqhhq\" (UID: \"379303a6-d86c-4861-b7d5-b46f2a336fb9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqhhq"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216155 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/42a13b86-8970-4571-9a14-9dea1c55558f-node-pullsecrets\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216185 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80f6a0e3-51c3-408f-93ca-c9bf5ed57f34-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-rgpbm\" (UID: \"80f6a0e3-51c3-408f-93ca-c9bf5ed57f34\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rgpbm"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216209 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc74s\" (UniqueName: \"kubernetes.io/projected/80f6a0e3-51c3-408f-93ca-c9bf5ed57f34-kube-api-access-cc74s\") pod \"authentication-operator-69f744f599-rgpbm\" (UID: \"80f6a0e3-51c3-408f-93ca-c9bf5ed57f34\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rgpbm"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216235 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4290a5ac-ad3d-4b7c-aa6a-8d714de728de-config\") pod \"console-operator-58897d9998-x95pq\" (UID: \"4290a5ac-ad3d-4b7c-aa6a-8d714de728de\") " pod="openshift-console-operator/console-operator-58897d9998-x95pq"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216259 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80f6a0e3-51c3-408f-93ca-c9bf5ed57f34-serving-cert\") pod \"authentication-operator-69f744f599-rgpbm\" (UID: \"80f6a0e3-51c3-408f-93ca-c9bf5ed57f34\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rgpbm"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216286 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216311 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4qnv\" (UniqueName: \"kubernetes.io/projected/32c4a374-18fd-4aac-807a-f191398b9490-kube-api-access-f4qnv\") pod \"openshift-apiserver-operator-796bbdcf4f-s62xw\" (UID: \"32c4a374-18fd-4aac-807a-f191398b9490\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s62xw"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216334 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p96p\" (UniqueName: \"kubernetes.io/projected/4290a5ac-ad3d-4b7c-aa6a-8d714de728de-kube-api-access-7p96p\") pod \"console-operator-58897d9998-x95pq\" (UID: \"4290a5ac-ad3d-4b7c-aa6a-8d714de728de\") " pod="openshift-console-operator/console-operator-58897d9998-x95pq"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216355 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/42a13b86-8970-4571-9a14-9dea1c55558f-etcd-client\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216382 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216406 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksggv\" (UniqueName: \"kubernetes.io/projected/214a7b7b-3483-4084-bf89-fffbe6e5d591-kube-api-access-ksggv\") pod \"etcd-operator-b45778765-dqr46\" (UID: \"214a7b7b-3483-4084-bf89-fffbe6e5d591\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dqr46"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216428 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42a13b86-8970-4571-9a14-9dea1c55558f-serving-cert\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216454 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k828r\" (UniqueName: \"kubernetes.io/projected/2eafc357-18f4-49a8-88be-d7e67ed800a0-kube-api-access-k828r\") pod \"collect-profiles-29566725-qt6v2\" (UID: \"2eafc357-18f4-49a8-88be-d7e67ed800a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-qt6v2"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216492 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216516 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjh6w\" (UniqueName: \"kubernetes.io/projected/0cd0f715-9ef8-4e6c-8e56-8e17bd66d882-kube-api-access-rjh6w\") pod \"machine-api-operator-5694c8668f-92bhx\" (UID: \"0cd0f715-9ef8-4e6c-8e56-8e17bd66d882\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-92bhx"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216542 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42a13b86-8970-4571-9a14-9dea1c55558f-config\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216581 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4krs7\" (UniqueName: \"kubernetes.io/projected/c569b375-f808-4f3d-8f3d-a162677356ff-kube-api-access-4krs7\") pod \"route-controller-manager-6576b87f9c-8ttcs\" (UID: \"c569b375-f808-4f3d-8f3d-a162677356ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216614 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0cd0f715-9ef8-4e6c-8e56-8e17bd66d882-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-92bhx\" (UID: \"0cd0f715-9ef8-4e6c-8e56-8e17bd66d882\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-92bhx"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216638 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/42a13b86-8970-4571-9a14-9dea1c55558f-audit-dir\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216666 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4"
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216691 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1b00244-2d56-4cfc-a852-0ef9c33214e1-trusted-ca-bundle\") pod
\"apiserver-7bbb656c7d-fc6tb\" (UID: \"f1b00244-2d56-4cfc-a852-0ef9c33214e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216714 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdx85\" (UniqueName: \"kubernetes.io/projected/c5f41a94-dc5d-4026-983e-52e817217252-kube-api-access-cdx85\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216737 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rlqn\" (UniqueName: \"kubernetes.io/projected/f1b00244-2d56-4cfc-a852-0ef9c33214e1-kube-api-access-7rlqn\") pod \"apiserver-7bbb656c7d-fc6tb\" (UID: \"f1b00244-2d56-4cfc-a852-0ef9c33214e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216755 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5phk7\" (UniqueName: \"kubernetes.io/projected/42a13b86-8970-4571-9a14-9dea1c55558f-kube-api-access-5phk7\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216783 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/178400b6-5c7d-4e98-9884-ac349ecc48e8-config\") pod \"kube-controller-manager-operator-78b949d7b-vnnmf\" (UID: \"178400b6-5c7d-4e98-9884-ac349ecc48e8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vnnmf" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216806 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/178400b6-5c7d-4e98-9884-ac349ecc48e8-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vnnmf\" (UID: \"178400b6-5c7d-4e98-9884-ac349ecc48e8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vnnmf" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216834 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216858 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c569b375-f808-4f3d-8f3d-a162677356ff-client-ca\") pod \"route-controller-manager-6576b87f9c-8ttcs\" (UID: \"c569b375-f808-4f3d-8f3d-a162677356ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216879 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f1b00244-2d56-4cfc-a852-0ef9c33214e1-audit-policies\") pod \"apiserver-7bbb656c7d-fc6tb\" (UID: \"f1b00244-2d56-4cfc-a852-0ef9c33214e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216927 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/874d0ff7-4923-4423-920a-59e6a632507a-trusted-ca-bundle\") pod \"console-f9d7485db-s2xcj\" (UID: \"874d0ff7-4923-4423-920a-59e6a632507a\") " 
pod="openshift-console/console-f9d7485db-s2xcj" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.216981 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/214a7b7b-3483-4084-bf89-fffbe6e5d591-etcd-service-ca\") pod \"etcd-operator-b45778765-dqr46\" (UID: \"214a7b7b-3483-4084-bf89-fffbe6e5d591\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dqr46" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.217214 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc4bv\" (UniqueName: \"kubernetes.io/projected/379303a6-d86c-4861-b7d5-b46f2a336fb9-kube-api-access-nc4bv\") pod \"openshift-controller-manager-operator-756b6f6bc6-wqhhq\" (UID: \"379303a6-d86c-4861-b7d5-b46f2a336fb9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqhhq" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.217308 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/42a13b86-8970-4571-9a14-9dea1c55558f-image-import-ca\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.217335 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpjln\" (UniqueName: \"kubernetes.io/projected/874d0ff7-4923-4423-920a-59e6a632507a-kube-api-access-rpjln\") pod \"console-f9d7485db-s2xcj\" (UID: \"874d0ff7-4923-4423-920a-59e6a632507a\") " pod="openshift-console/console-f9d7485db-s2xcj" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.217364 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/288d9110-491b-424c-b001-47835e23220f-signing-cabundle\") pod \"service-ca-9c57cc56f-djlcz\" (UID: \"288d9110-491b-424c-b001-47835e23220f\") " pod="openshift-service-ca/service-ca-9c57cc56f-djlcz" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.217387 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff9715a6-753c-4341-899a-e769836ad4e1-serving-cert\") pod \"service-ca-operator-777779d784-96sfc\" (UID: \"ff9715a6-753c-4341-899a-e769836ad4e1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-96sfc" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.217411 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2-config\") pod \"controller-manager-879f6c89f-8f97r\" (UID: \"0cb76749-e1ee-42c1-bcef-cb2ca1d793d2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8f97r" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.217441 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf246004-3476-4ae3-8a8b-25f2e6d44ec9-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-xz9q6\" (UID: \"cf246004-3476-4ae3-8a8b-25f2e6d44ec9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xz9q6" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.217465 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/379303a6-d86c-4861-b7d5-b46f2a336fb9-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wqhhq\" (UID: \"379303a6-d86c-4861-b7d5-b46f2a336fb9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqhhq" Mar 20 10:57:07 crc 
kubenswrapper[4695]: I0320 10:57:07.217488 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42a13b86-8970-4571-9a14-9dea1c55558f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.217515 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.217544 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.218658 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f1b00244-2d56-4cfc-a852-0ef9c33214e1-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fc6tb\" (UID: \"f1b00244-2d56-4cfc-a852-0ef9c33214e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.219015 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/42a13b86-8970-4571-9a14-9dea1c55558f-audit-dir\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " 
pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.219650 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/649c40d8-a63f-4361-89b5-15e4ccc2e4dc-available-featuregates\") pod \"openshift-config-operator-7777fb866f-ns9rz\" (UID: \"649c40d8-a63f-4361-89b5-15e4ccc2e4dc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ns9rz" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.235797 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-szd4f"] Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.235867 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qd69b"] Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.236192 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42a13b86-8970-4571-9a14-9dea1c55558f-config\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.236971 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c5f41a94-dc5d-4026-983e-52e817217252-audit-policies\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.237007 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0cd0f715-9ef8-4e6c-8e56-8e17bd66d882-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-92bhx\" (UID: 
\"0cd0f715-9ef8-4e6c-8e56-8e17bd66d882\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-92bhx" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.237308 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1b00244-2d56-4cfc-a852-0ef9c33214e1-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fc6tb\" (UID: \"f1b00244-2d56-4cfc-a852-0ef9c33214e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.239313 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4290a5ac-ad3d-4b7c-aa6a-8d714de728de-config\") pod \"console-operator-58897d9998-x95pq\" (UID: \"4290a5ac-ad3d-4b7c-aa6a-8d714de728de\") " pod="openshift-console-operator/console-operator-58897d9998-x95pq" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.239355 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.241091 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/31256627-4fcd-49a5-87b1-0e52e6265720-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-zhb2h\" (UID: \"31256627-4fcd-49a5-87b1-0e52e6265720\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zhb2h" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.242655 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 
10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.243621 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/214a7b7b-3483-4084-bf89-fffbe6e5d591-etcd-ca\") pod \"etcd-operator-b45778765-dqr46\" (UID: \"214a7b7b-3483-4084-bf89-fffbe6e5d591\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dqr46" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.243684 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c569b375-f808-4f3d-8f3d-a162677356ff-config\") pod \"route-controller-manager-6576b87f9c-8ttcs\" (UID: \"c569b375-f808-4f3d-8f3d-a162677356ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.244774 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/42a13b86-8970-4571-9a14-9dea1c55558f-node-pullsecrets\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.245458 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32c4a374-18fd-4aac-807a-f191398b9490-config\") pod \"openshift-apiserver-operator-796bbdcf4f-s62xw\" (UID: \"32c4a374-18fd-4aac-807a-f191398b9490\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s62xw" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.245880 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f1b00244-2d56-4cfc-a852-0ef9c33214e1-etcd-client\") pod \"apiserver-7bbb656c7d-fc6tb\" (UID: \"f1b00244-2d56-4cfc-a852-0ef9c33214e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb" Mar 20 
10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.245900 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.246274 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80f6a0e3-51c3-408f-93ca-c9bf5ed57f34-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-rgpbm\" (UID: \"80f6a0e3-51c3-408f-93ca-c9bf5ed57f34\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rgpbm" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.246324 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.247634 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4290a5ac-ad3d-4b7c-aa6a-8d714de728de-serving-cert\") pod \"console-operator-58897d9998-x95pq\" (UID: \"4290a5ac-ad3d-4b7c-aa6a-8d714de728de\") " pod="openshift-console-operator/console-operator-58897d9998-x95pq" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.247877 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0cd0f715-9ef8-4e6c-8e56-8e17bd66d882-images\") pod \"machine-api-operator-5694c8668f-92bhx\" (UID: 
\"0cd0f715-9ef8-4e6c-8e56-8e17bd66d882\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-92bhx" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.248048 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mmw7q"] Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.254105 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-x95pq"] Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.254140 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9z5kl"] Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.248969 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/649c40d8-a63f-4361-89b5-15e4ccc2e4dc-serving-cert\") pod \"openshift-config-operator-7777fb866f-ns9rz\" (UID: \"649c40d8-a63f-4361-89b5-15e4ccc2e4dc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ns9rz" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.249567 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/874d0ff7-4923-4423-920a-59e6a632507a-service-ca\") pod \"console-f9d7485db-s2xcj\" (UID: \"874d0ff7-4923-4423-920a-59e6a632507a\") " pod="openshift-console/console-f9d7485db-s2xcj" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.249636 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c5f41a94-dc5d-4026-983e-52e817217252-audit-dir\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.250444 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/214a7b7b-3483-4084-bf89-fffbe6e5d591-config\") pod \"etcd-operator-b45778765-dqr46\" (UID: \"214a7b7b-3483-4084-bf89-fffbe6e5d591\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dqr46" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.251317 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/42a13b86-8970-4571-9a14-9dea1c55558f-etcd-serving-ca\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.251732 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/874d0ff7-4923-4423-920a-59e6a632507a-oauth-serving-cert\") pod \"console-f9d7485db-s2xcj\" (UID: \"874d0ff7-4923-4423-920a-59e6a632507a\") " pod="openshift-console/console-f9d7485db-s2xcj" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.252467 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cd0f715-9ef8-4e6c-8e56-8e17bd66d882-config\") pod \"machine-api-operator-5694c8668f-92bhx\" (UID: \"0cd0f715-9ef8-4e6c-8e56-8e17bd66d882\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-92bhx" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.253253 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80f6a0e3-51c3-408f-93ca-c9bf5ed57f34-service-ca-bundle\") pod \"authentication-operator-69f744f599-rgpbm\" (UID: \"80f6a0e3-51c3-408f-93ca-c9bf5ed57f34\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rgpbm" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.253749 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c281c741-25bd-4650-a6aa-6e33eaf3d80c-auth-proxy-config\") pod \"machine-approver-56656f9798-zlr2v\" (UID: \"c281c741-25bd-4650-a6aa-6e33eaf3d80c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zlr2v" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.253770 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/42a13b86-8970-4571-9a14-9dea1c55558f-audit\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.248205 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.248555 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c281c741-25bd-4650-a6aa-6e33eaf3d80c-config\") pod \"machine-approver-56656f9798-zlr2v\" (UID: \"c281c741-25bd-4650-a6aa-6e33eaf3d80c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zlr2v" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.254738 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f1b00244-2d56-4cfc-a852-0ef9c33214e1-audit-policies\") pod \"apiserver-7bbb656c7d-fc6tb\" (UID: \"f1b00244-2d56-4cfc-a852-0ef9c33214e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.255301 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-8f97r\" (UID: \"0cb76749-e1ee-42c1-bcef-cb2ca1d793d2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8f97r" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.256012 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/42a13b86-8970-4571-9a14-9dea1c55558f-image-import-ca\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.256942 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/874d0ff7-4923-4423-920a-59e6a632507a-console-config\") pod \"console-f9d7485db-s2xcj\" (UID: \"874d0ff7-4923-4423-920a-59e6a632507a\") " pod="openshift-console/console-f9d7485db-s2xcj" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.257215 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/214a7b7b-3483-4084-bf89-fffbe6e5d591-etcd-service-ca\") pod \"etcd-operator-b45778765-dqr46\" (UID: \"214a7b7b-3483-4084-bf89-fffbe6e5d591\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dqr46" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.258700 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/42a13b86-8970-4571-9a14-9dea1c55558f-etcd-client\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.259653 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/c569b375-f808-4f3d-8f3d-a162677356ff-client-ca\") pod \"route-controller-manager-6576b87f9c-8ttcs\" (UID: \"c569b375-f808-4f3d-8f3d-a162677356ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.260132 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.260481 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.260773 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42a13b86-8970-4571-9a14-9dea1c55558f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.261085 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2-config\") pod \"controller-manager-879f6c89f-8f97r\" (UID: \"0cb76749-e1ee-42c1-bcef-cb2ca1d793d2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8f97r" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.261391 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/80f6a0e3-51c3-408f-93ca-c9bf5ed57f34-serving-cert\") pod \"authentication-operator-69f744f599-rgpbm\" (UID: \"80f6a0e3-51c3-408f-93ca-c9bf5ed57f34\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rgpbm" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.262215 4695 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/214a7b7b-3483-4084-bf89-fffbe6e5d591-etcd-client\") pod \"etcd-operator-b45778765-dqr46\" (UID: \"214a7b7b-3483-4084-bf89-fffbe6e5d591\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dqr46" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.262793 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/874d0ff7-4923-4423-920a-59e6a632507a-trusted-ca-bundle\") pod \"console-f9d7485db-s2xcj\" (UID: \"874d0ff7-4923-4423-920a-59e6a632507a\") " pod="openshift-console/console-f9d7485db-s2xcj" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.263186 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/874d0ff7-4923-4423-920a-59e6a632507a-console-oauth-config\") pod \"console-f9d7485db-s2xcj\" (UID: \"874d0ff7-4923-4423-920a-59e6a632507a\") " pod="openshift-console/console-f9d7485db-s2xcj" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.263243 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/42a13b86-8970-4571-9a14-9dea1c55558f-serving-cert\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.264134 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80f6a0e3-51c3-408f-93ca-c9bf5ed57f34-config\") pod \"authentication-operator-69f744f599-rgpbm\" (UID: \"80f6a0e3-51c3-408f-93ca-c9bf5ed57f34\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rgpbm" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.265415 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/4290a5ac-ad3d-4b7c-aa6a-8d714de728de-trusted-ca\") pod \"console-operator-58897d9998-x95pq\" (UID: \"4290a5ac-ad3d-4b7c-aa6a-8d714de728de\") " pod="openshift-console-operator/console-operator-58897d9998-x95pq" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.265579 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.262725 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m9rp"] Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.266731 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.267195 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2a2b9241-d006-442c-9133-74ac83793483-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-2dmjw\" (UID: \"2a2b9241-d006-442c-9133-74ac83793483\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2dmjw" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.267509 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.267576 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f1b00244-2d56-4cfc-a852-0ef9c33214e1-audit-dir\") pod \"apiserver-7bbb656c7d-fc6tb\" (UID: \"f1b00244-2d56-4cfc-a852-0ef9c33214e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.268276 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2-client-ca\") pod \"controller-manager-879f6c89f-8f97r\" (UID: \"0cb76749-e1ee-42c1-bcef-cb2ca1d793d2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8f97r" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.269751 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/42a13b86-8970-4571-9a14-9dea1c55558f-encryption-config\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.270586 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f1b00244-2d56-4cfc-a852-0ef9c33214e1-encryption-config\") pod \"apiserver-7bbb656c7d-fc6tb\" (UID: \"f1b00244-2d56-4cfc-a852-0ef9c33214e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.270588 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/214a7b7b-3483-4084-bf89-fffbe6e5d591-serving-cert\") pod \"etcd-operator-b45778765-dqr46\" (UID: \"214a7b7b-3483-4084-bf89-fffbe6e5d591\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dqr46" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.270796 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/874d0ff7-4923-4423-920a-59e6a632507a-console-serving-cert\") pod \"console-f9d7485db-s2xcj\" (UID: \"874d0ff7-4923-4423-920a-59e6a632507a\") " pod="openshift-console/console-f9d7485db-s2xcj" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.271774 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1b00244-2d56-4cfc-a852-0ef9c33214e1-serving-cert\") pod \"apiserver-7bbb656c7d-fc6tb\" (UID: \"f1b00244-2d56-4cfc-a852-0ef9c33214e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.271796 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.272085 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5d71b917-7595-49dd-8c41-e7e642210d35-metrics-tls\") pod \"dns-operator-744455d44c-ghv5z\" (UID: \"5d71b917-7595-49dd-8c41-e7e642210d35\") " pod="openshift-dns-operator/dns-operator-744455d44c-ghv5z" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.273160 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dqr46"] 
Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.274539 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/31256627-4fcd-49a5-87b1-0e52e6265720-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-zhb2h\" (UID: \"31256627-4fcd-49a5-87b1-0e52e6265720\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zhb2h" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.276457 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.276461 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32c4a374-18fd-4aac-807a-f191398b9490-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-s62xw\" (UID: \"32c4a374-18fd-4aac-807a-f191398b9490\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s62xw" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.276552 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf246004-3476-4ae3-8a8b-25f2e6d44ec9-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-xz9q6\" (UID: \"cf246004-3476-4ae3-8a8b-25f2e6d44ec9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xz9q6" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.276594 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zhb2h"] Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.276712 4695 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2-serving-cert\") pod \"controller-manager-879f6c89f-8f97r\" (UID: \"0cb76749-e1ee-42c1-bcef-cb2ca1d793d2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8f97r" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.276544 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.277256 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c569b375-f808-4f3d-8f3d-a162677356ff-serving-cert\") pod \"route-controller-manager-6576b87f9c-8ttcs\" (UID: \"c569b375-f808-4f3d-8f3d-a162677356ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.277586 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.277701 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c281c741-25bd-4650-a6aa-6e33eaf3d80c-machine-approver-tls\") pod \"machine-approver-56656f9798-zlr2v\" (UID: \"c281c741-25bd-4650-a6aa-6e33eaf3d80c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zlr2v" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.277800 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.281973 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xc2wd"] Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.284087 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf246004-3476-4ae3-8a8b-25f2e6d44ec9-config\") pod \"kube-apiserver-operator-766d6c64bb-xz9q6\" (UID: \"cf246004-3476-4ae3-8a8b-25f2e6d44ec9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xz9q6" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.284365 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lkn7q"] Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.285589 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7f96s"] Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.286744 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-fxxgv"] Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.288857 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zwjht"] Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.289028 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-fxxgv" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.289199 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566736-t45k7"] Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.290221 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-96sfc"] Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.292100 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-w7b9p"] Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.293868 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-djlcz"] Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.295271 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qjmq4"] Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.296344 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fxxgv"] Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.297492 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-spmzr"] Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.298552 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-spmzr" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.298615 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dsv9m"] Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.299258 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.299427 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dsv9m" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.299843 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dsv9m"] Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.318305 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/178400b6-5c7d-4e98-9884-ac349ecc48e8-config\") pod \"kube-controller-manager-operator-78b949d7b-vnnmf\" (UID: \"178400b6-5c7d-4e98-9884-ac349ecc48e8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vnnmf" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.318347 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/178400b6-5c7d-4e98-9884-ac349ecc48e8-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vnnmf\" (UID: \"178400b6-5c7d-4e98-9884-ac349ecc48e8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vnnmf" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.318386 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/288d9110-491b-424c-b001-47835e23220f-signing-cabundle\") pod \"service-ca-9c57cc56f-djlcz\" (UID: \"288d9110-491b-424c-b001-47835e23220f\") " pod="openshift-service-ca/service-ca-9c57cc56f-djlcz" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.318404 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff9715a6-753c-4341-899a-e769836ad4e1-serving-cert\") pod \"service-ca-operator-777779d784-96sfc\" (UID: \"ff9715a6-753c-4341-899a-e769836ad4e1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-96sfc" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.318434 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff9715a6-753c-4341-899a-e769836ad4e1-config\") pod \"service-ca-operator-777779d784-96sfc\" (UID: \"ff9715a6-753c-4341-899a-e769836ad4e1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-96sfc" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.318448 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnvjd\" (UniqueName: \"kubernetes.io/projected/ff9715a6-753c-4341-899a-e769836ad4e1-kube-api-access-xnvjd\") pod \"service-ca-operator-777779d784-96sfc\" (UID: \"ff9715a6-753c-4341-899a-e769836ad4e1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-96sfc" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.318467 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2eafc357-18f4-49a8-88be-d7e67ed800a0-secret-volume\") pod \"collect-profiles-29566725-qt6v2\" (UID: \"2eafc357-18f4-49a8-88be-d7e67ed800a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-qt6v2" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.318482 4695 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/288d9110-491b-424c-b001-47835e23220f-signing-key\") pod \"service-ca-9c57cc56f-djlcz\" (UID: \"288d9110-491b-424c-b001-47835e23220f\") " pod="openshift-service-ca/service-ca-9c57cc56f-djlcz" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.318525 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2eafc357-18f4-49a8-88be-d7e67ed800a0-config-volume\") pod \"collect-profiles-29566725-qt6v2\" (UID: \"2eafc357-18f4-49a8-88be-d7e67ed800a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-qt6v2" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.318549 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/178400b6-5c7d-4e98-9884-ac349ecc48e8-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vnnmf\" (UID: \"178400b6-5c7d-4e98-9884-ac349ecc48e8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vnnmf" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.318566 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4pg7\" (UniqueName: \"kubernetes.io/projected/288d9110-491b-424c-b001-47835e23220f-kube-api-access-x4pg7\") pod \"service-ca-9c57cc56f-djlcz\" (UID: \"288d9110-491b-424c-b001-47835e23220f\") " pod="openshift-service-ca/service-ca-9c57cc56f-djlcz" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.318727 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.318892 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k828r\" (UniqueName: 
\"kubernetes.io/projected/2eafc357-18f4-49a8-88be-d7e67ed800a0-kube-api-access-k828r\") pod \"collect-profiles-29566725-qt6v2\" (UID: \"2eafc357-18f4-49a8-88be-d7e67ed800a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-qt6v2" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.330681 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/379303a6-d86c-4861-b7d5-b46f2a336fb9-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-wqhhq\" (UID: \"379303a6-d86c-4861-b7d5-b46f2a336fb9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqhhq" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.338723 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.345641 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/379303a6-d86c-4861-b7d5-b46f2a336fb9-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-wqhhq\" (UID: \"379303a6-d86c-4861-b7d5-b46f2a336fb9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqhhq" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.359372 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.397856 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.418401 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 10:57:07 crc kubenswrapper[4695]: 
I0320 10:57:07.438276 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.458749 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.462267 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/178400b6-5c7d-4e98-9884-ac349ecc48e8-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vnnmf\" (UID: \"178400b6-5c7d-4e98-9884-ac349ecc48e8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vnnmf" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.477987 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.479894 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/178400b6-5c7d-4e98-9884-ac349ecc48e8-config\") pod \"kube-controller-manager-operator-78b949d7b-vnnmf\" (UID: \"178400b6-5c7d-4e98-9884-ac349ecc48e8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vnnmf" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.507479 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.527835 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.537475 4695 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.558463 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.578795 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.598424 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.617566 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.637583 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.657462 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.677360 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.696871 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.717947 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.738356 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 
10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.757217 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.777538 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.797114 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.819627 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.838207 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.856634 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.878841 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.898149 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.917901 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.938292 4695 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.957441 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.978722 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 10:57:07 crc kubenswrapper[4695]: I0320 10:57:07.997522 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.017750 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.039069 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.058144 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.078162 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.110589 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.115749 4695 request.go:700] Waited for 1.015137423s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.117892 4695 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.138507 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.157730 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.178563 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.197370 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.218321 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.223135 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2eafc357-18f4-49a8-88be-d7e67ed800a0-secret-volume\") pod \"collect-profiles-29566725-qt6v2\" (UID: \"2eafc357-18f4-49a8-88be-d7e67ed800a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-qt6v2" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.237721 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.240632 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2eafc357-18f4-49a8-88be-d7e67ed800a0-config-volume\") pod \"collect-profiles-29566725-qt6v2\" (UID: \"2eafc357-18f4-49a8-88be-d7e67ed800a0\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-qt6v2" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.258287 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.278274 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.297672 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.317987 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 10:57:08 crc kubenswrapper[4695]: E0320 10:57:08.318821 4695 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Mar 20 10:57:08 crc kubenswrapper[4695]: E0320 10:57:08.319021 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ff9715a6-753c-4341-899a-e769836ad4e1-config podName:ff9715a6-753c-4341-899a-e769836ad4e1 nodeName:}" failed. No retries permitted until 2026-03-20 10:57:08.818995965 +0000 UTC m=+206.599601528 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/ff9715a6-753c-4341-899a-e769836ad4e1-config") pod "service-ca-operator-777779d784-96sfc" (UID: "ff9715a6-753c-4341-899a-e769836ad4e1") : failed to sync configmap cache: timed out waiting for the condition Mar 20 10:57:08 crc kubenswrapper[4695]: E0320 10:57:08.318823 4695 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Mar 20 10:57:08 crc kubenswrapper[4695]: E0320 10:57:08.319274 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/288d9110-491b-424c-b001-47835e23220f-signing-cabundle podName:288d9110-491b-424c-b001-47835e23220f nodeName:}" failed. No retries permitted until 2026-03-20 10:57:08.819261861 +0000 UTC m=+206.599867424 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/288d9110-491b-424c-b001-47835e23220f-signing-cabundle") pod "service-ca-9c57cc56f-djlcz" (UID: "288d9110-491b-424c-b001-47835e23220f") : failed to sync configmap cache: timed out waiting for the condition Mar 20 10:57:08 crc kubenswrapper[4695]: E0320 10:57:08.318833 4695 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 10:57:08 crc kubenswrapper[4695]: E0320 10:57:08.318863 4695 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Mar 20 10:57:08 crc kubenswrapper[4695]: E0320 10:57:08.319472 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff9715a6-753c-4341-899a-e769836ad4e1-serving-cert podName:ff9715a6-753c-4341-899a-e769836ad4e1 nodeName:}" failed. No retries permitted until 2026-03-20 10:57:08.819461357 +0000 UTC m=+206.600066920 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ff9715a6-753c-4341-899a-e769836ad4e1-serving-cert") pod "service-ca-operator-777779d784-96sfc" (UID: "ff9715a6-753c-4341-899a-e769836ad4e1") : failed to sync secret cache: timed out waiting for the condition Mar 20 10:57:08 crc kubenswrapper[4695]: E0320 10:57:08.319549 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/288d9110-491b-424c-b001-47835e23220f-signing-key podName:288d9110-491b-424c-b001-47835e23220f nodeName:}" failed. No retries permitted until 2026-03-20 10:57:08.819532518 +0000 UTC m=+206.600138091 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/288d9110-491b-424c-b001-47835e23220f-signing-key") pod "service-ca-9c57cc56f-djlcz" (UID: "288d9110-491b-424c-b001-47835e23220f") : failed to sync secret cache: timed out waiting for the condition Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.337710 4695 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.357466 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.377989 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.398385 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.418309 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.431326 4695 patch_prober.go:28] 
interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.431407 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.443997 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.456848 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.478106 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.498338 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.517288 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.538031 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.558116 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.578636 4695 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.597018 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.618815 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.638762 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.657808 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.677857 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.698540 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.718284 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.738563 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.758209 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.777489 4695 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.837825 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7q7t\" (UniqueName: \"kubernetes.io/projected/31256627-4fcd-49a5-87b1-0e52e6265720-kube-api-access-s7q7t\") pod \"cluster-image-registry-operator-dc59b4c8b-zhb2h\" (UID: \"31256627-4fcd-49a5-87b1-0e52e6265720\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zhb2h" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.844011 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/288d9110-491b-424c-b001-47835e23220f-signing-cabundle\") pod \"service-ca-9c57cc56f-djlcz\" (UID: \"288d9110-491b-424c-b001-47835e23220f\") " pod="openshift-service-ca/service-ca-9c57cc56f-djlcz" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.844114 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff9715a6-753c-4341-899a-e769836ad4e1-serving-cert\") pod \"service-ca-operator-777779d784-96sfc\" (UID: \"ff9715a6-753c-4341-899a-e769836ad4e1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-96sfc" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.844193 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff9715a6-753c-4341-899a-e769836ad4e1-config\") pod \"service-ca-operator-777779d784-96sfc\" (UID: \"ff9715a6-753c-4341-899a-e769836ad4e1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-96sfc" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.844219 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/288d9110-491b-424c-b001-47835e23220f-signing-key\") pod 
\"service-ca-9c57cc56f-djlcz\" (UID: \"288d9110-491b-424c-b001-47835e23220f\") " pod="openshift-service-ca/service-ca-9c57cc56f-djlcz" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.844863 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/288d9110-491b-424c-b001-47835e23220f-signing-cabundle\") pod \"service-ca-9c57cc56f-djlcz\" (UID: \"288d9110-491b-424c-b001-47835e23220f\") " pod="openshift-service-ca/service-ca-9c57cc56f-djlcz" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.845518 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff9715a6-753c-4341-899a-e769836ad4e1-config\") pod \"service-ca-operator-777779d784-96sfc\" (UID: \"ff9715a6-753c-4341-899a-e769836ad4e1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-96sfc" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.848508 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff9715a6-753c-4341-899a-e769836ad4e1-serving-cert\") pod \"service-ca-operator-777779d784-96sfc\" (UID: \"ff9715a6-753c-4341-899a-e769836ad4e1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-96sfc" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.849156 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/288d9110-491b-424c-b001-47835e23220f-signing-key\") pod \"service-ca-9c57cc56f-djlcz\" (UID: \"288d9110-491b-424c-b001-47835e23220f\") " pod="openshift-service-ca/service-ca-9c57cc56f-djlcz" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.856680 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlvpw\" (UniqueName: \"kubernetes.io/projected/5d71b917-7595-49dd-8c41-e7e642210d35-kube-api-access-qlvpw\") pod 
\"dns-operator-744455d44c-ghv5z\" (UID: \"5d71b917-7595-49dd-8c41-e7e642210d35\") " pod="openshift-dns-operator/dns-operator-744455d44c-ghv5z" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.873476 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/31256627-4fcd-49a5-87b1-0e52e6265720-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-zhb2h\" (UID: \"31256627-4fcd-49a5-87b1-0e52e6265720\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zhb2h" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.894006 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4krs7\" (UniqueName: \"kubernetes.io/projected/c569b375-f808-4f3d-8f3d-a162677356ff-kube-api-access-4krs7\") pod \"route-controller-manager-6576b87f9c-8ttcs\" (UID: \"c569b375-f808-4f3d-8f3d-a162677356ff\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.914139 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pslv7\" (UniqueName: \"kubernetes.io/projected/c281c741-25bd-4650-a6aa-6e33eaf3d80c-kube-api-access-pslv7\") pod \"machine-approver-56656f9798-zlr2v\" (UID: \"c281c741-25bd-4650-a6aa-6e33eaf3d80c\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zlr2v" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.933893 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f524x\" (UniqueName: \"kubernetes.io/projected/2a2b9241-d006-442c-9133-74ac83793483-kube-api-access-f524x\") pod \"cluster-samples-operator-665b6dd947-2dmjw\" (UID: \"2a2b9241-d006-442c-9133-74ac83793483\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2dmjw" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.951895 4695 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.955177 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4qnv\" (UniqueName: \"kubernetes.io/projected/32c4a374-18fd-4aac-807a-f191398b9490-kube-api-access-f4qnv\") pod \"openshift-apiserver-operator-796bbdcf4f-s62xw\" (UID: \"32c4a374-18fd-4aac-807a-f191398b9490\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s62xw" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.956611 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s62xw" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.974521 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc74s\" (UniqueName: \"kubernetes.io/projected/80f6a0e3-51c3-408f-93ca-c9bf5ed57f34-kube-api-access-cc74s\") pod \"authentication-operator-69f744f599-rgpbm\" (UID: \"80f6a0e3-51c3-408f-93ca-c9bf5ed57f34\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-rgpbm" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.986174 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-ghv5z" Mar 20 10:57:08 crc kubenswrapper[4695]: I0320 10:57:08.992395 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv2gf\" (UniqueName: \"kubernetes.io/projected/649c40d8-a63f-4361-89b5-15e4ccc2e4dc-kube-api-access-bv2gf\") pod \"openshift-config-operator-7777fb866f-ns9rz\" (UID: \"649c40d8-a63f-4361-89b5-15e4ccc2e4dc\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-ns9rz" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.012065 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p96p\" (UniqueName: \"kubernetes.io/projected/4290a5ac-ad3d-4b7c-aa6a-8d714de728de-kube-api-access-7p96p\") pod \"console-operator-58897d9998-x95pq\" (UID: \"4290a5ac-ad3d-4b7c-aa6a-8d714de728de\") " pod="openshift-console-operator/console-operator-58897d9998-x95pq" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.017582 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-x95pq" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.037030 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjh6w\" (UniqueName: \"kubernetes.io/projected/0cd0f715-9ef8-4e6c-8e56-8e17bd66d882-kube-api-access-rjh6w\") pod \"machine-api-operator-5694c8668f-92bhx\" (UID: \"0cd0f715-9ef8-4e6c-8e56-8e17bd66d882\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-92bhx" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.058138 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksggv\" (UniqueName: \"kubernetes.io/projected/214a7b7b-3483-4084-bf89-fffbe6e5d591-kube-api-access-ksggv\") pod \"etcd-operator-b45778765-dqr46\" (UID: \"214a7b7b-3483-4084-bf89-fffbe6e5d591\") " pod="openshift-etcd-operator/etcd-operator-b45778765-dqr46" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.071047 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.080052 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c96kl\" (UniqueName: \"kubernetes.io/projected/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2-kube-api-access-c96kl\") pod \"controller-manager-879f6c89f-8f97r\" (UID: \"0cb76749-e1ee-42c1-bcef-cb2ca1d793d2\") " pod="openshift-controller-manager/controller-manager-879f6c89f-8f97r" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.088850 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-92bhx" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.099262 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zhb2h" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.100579 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zlr2v" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.105228 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw6g9\" (UniqueName: \"kubernetes.io/projected/acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564-kube-api-access-vw6g9\") pod \"downloads-7954f5f757-dzwsk\" (UID: \"acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564\") " pod="openshift-console/downloads-7954f5f757-dzwsk" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.124490 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rlqn\" (UniqueName: \"kubernetes.io/projected/f1b00244-2d56-4cfc-a852-0ef9c33214e1-kube-api-access-7rlqn\") pod \"apiserver-7bbb656c7d-fc6tb\" (UID: \"f1b00244-2d56-4cfc-a852-0ef9c33214e1\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.136391 4695 request.go:700] Waited for 1.882268613s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver/serviceaccounts/openshift-apiserver-sa/token Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.139764 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdx85\" (UniqueName: \"kubernetes.io/projected/c5f41a94-dc5d-4026-983e-52e817217252-kube-api-access-cdx85\") pod \"oauth-openshift-558db77b4-r82b4\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.155697 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5phk7\" 
(UniqueName: \"kubernetes.io/projected/42a13b86-8970-4571-9a14-9dea1c55558f-kube-api-access-5phk7\") pod \"apiserver-76f77b778f-z8vn7\" (UID: \"42a13b86-8970-4571-9a14-9dea1c55558f\") " pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.160820 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8f97r" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.169367 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.184353 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-rgpbm" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.187206 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc4bv\" (UniqueName: \"kubernetes.io/projected/379303a6-d86c-4861-b7d5-b46f2a336fb9-kube-api-access-nc4bv\") pod \"openshift-controller-manager-operator-756b6f6bc6-wqhhq\" (UID: \"379303a6-d86c-4861-b7d5-b46f2a336fb9\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqhhq" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.197471 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s62xw"] Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.200690 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpjln\" (UniqueName: \"kubernetes.io/projected/874d0ff7-4923-4423-920a-59e6a632507a-kube-api-access-rpjln\") pod \"console-f9d7485db-s2xcj\" (UID: \"874d0ff7-4923-4423-920a-59e6a632507a\") " pod="openshift-console/console-f9d7485db-s2xcj" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 
10:57:09.204286 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ns9rz" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.212790 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2dmjw" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.217997 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf246004-3476-4ae3-8a8b-25f2e6d44ec9-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-xz9q6\" (UID: \"cf246004-3476-4ae3-8a8b-25f2e6d44ec9\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xz9q6" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.218029 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.218447 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-ghv5z"] Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.227144 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-dzwsk" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.239049 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.239969 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 10:57:09 crc kubenswrapper[4695]: W0320 10:57:09.260517 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d71b917_7595_49dd_8c41_e7e642210d35.slice/crio-a2ad937422a8b78af06df0df645df5f2ea3d63f08449193783cc09feddd22a7d WatchSource:0}: Error finding container a2ad937422a8b78af06df0df645df5f2ea3d63f08449193783cc09feddd22a7d: Status 404 returned error can't find the container with id a2ad937422a8b78af06df0df645df5f2ea3d63f08449193783cc09feddd22a7d Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.260868 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.268567 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-dqr46" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.278075 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.297580 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.301451 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.319611 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.339199 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.361154 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.379449 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.405199 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.406302 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-s2xcj" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.418378 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xz9q6" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.426015 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqhhq" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.436990 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/178400b6-5c7d-4e98-9884-ac349ecc48e8-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vnnmf\" (UID: \"178400b6-5c7d-4e98-9884-ac349ecc48e8\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vnnmf" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.442808 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vnnmf" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.465462 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4pg7\" (UniqueName: \"kubernetes.io/projected/288d9110-491b-424c-b001-47835e23220f-kube-api-access-x4pg7\") pod \"service-ca-9c57cc56f-djlcz\" (UID: \"288d9110-491b-424c-b001-47835e23220f\") " pod="openshift-service-ca/service-ca-9c57cc56f-djlcz" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.485787 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k828r\" (UniqueName: \"kubernetes.io/projected/2eafc357-18f4-49a8-88be-d7e67ed800a0-kube-api-access-k828r\") pod \"collect-profiles-29566725-qt6v2\" (UID: \"2eafc357-18f4-49a8-88be-d7e67ed800a0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-qt6v2" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.499356 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnvjd\" (UniqueName: \"kubernetes.io/projected/ff9715a6-753c-4341-899a-e769836ad4e1-kube-api-access-xnvjd\") pod \"service-ca-operator-777779d784-96sfc\" (UID: 
\"ff9715a6-753c-4341-899a-e769836ad4e1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-96sfc" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.535968 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-qt6v2" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.585249 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.585292 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/58b17661-5628-4e68-aa9e-9d6e850b6dbe-registry-tls\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.585322 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9e14b497-1345-4352-9b10-10deaeb1f6ef-default-certificate\") pod \"router-default-5444994796-gnknn\" (UID: \"9e14b497-1345-4352-9b10-10deaeb1f6ef\") " pod="openshift-ingress/router-default-5444994796-gnknn" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.585340 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45e62be2-918a-437b-9028-771a6ad9957e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5m9rp\" (UID: 
\"45e62be2-918a-437b-9028-771a6ad9957e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m9rp" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.585375 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1fa5c872-2ea1-43b9-b09d-d40de9e50bca-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qjmq4\" (UID: \"1fa5c872-2ea1-43b9-b09d-d40de9e50bca\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qjmq4" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.585389 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/287784c8-36de-41b8-8203-1923a9e1624b-proxy-tls\") pod \"machine-config-operator-74547568cd-qd69b\" (UID: \"287784c8-36de-41b8-8203-1923a9e1624b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qd69b" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.585439 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b8cd777d-2168-411d-8232-1162ad5a99d2-apiservice-cert\") pod \"packageserver-d55dfcdfc-zwjht\" (UID: \"b8cd777d-2168-411d-8232-1162ad5a99d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zwjht" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.585456 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8936aa6b-6612-48bc-b512-d2c0d72b730a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xc2wd\" (UID: \"8936aa6b-6612-48bc-b512-d2c0d72b730a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xc2wd" Mar 20 10:57:09 crc 
kubenswrapper[4695]: I0320 10:57:09.585538 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4f2b2bcc-8fa7-4d23-8577-4e09d5485874-profile-collector-cert\") pod \"catalog-operator-68c6474976-9z5kl\" (UID: \"4f2b2bcc-8fa7-4d23-8577-4e09d5485874\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9z5kl" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.585565 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9e14b497-1345-4352-9b10-10deaeb1f6ef-stats-auth\") pod \"router-default-5444994796-gnknn\" (UID: \"9e14b497-1345-4352-9b10-10deaeb1f6ef\") " pod="openshift-ingress/router-default-5444994796-gnknn" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.585582 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45e62be2-918a-437b-9028-771a6ad9957e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5m9rp\" (UID: \"45e62be2-918a-437b-9028-771a6ad9957e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m9rp" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.585629 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlw29\" (UniqueName: \"kubernetes.io/projected/b1ec0c6c-21bf-4b5f-b973-fd68c0d1c0f5-kube-api-access-wlw29\") pod \"auto-csr-approver-29566736-t45k7\" (UID: \"b1ec0c6c-21bf-4b5f-b973-fd68c0d1c0f5\") " pod="openshift-infra/auto-csr-approver-29566736-t45k7" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.585657 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq7kt\" (UniqueName: 
\"kubernetes.io/projected/4f2b2bcc-8fa7-4d23-8577-4e09d5485874-kube-api-access-lq7kt\") pod \"catalog-operator-68c6474976-9z5kl\" (UID: \"4f2b2bcc-8fa7-4d23-8577-4e09d5485874\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9z5kl" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.585673 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59e89e9c-6d73-42f9-98e0-674beb4ffab6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hdzsw\" (UID: \"59e89e9c-6d73-42f9-98e0-674beb4ffab6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hdzsw" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.585703 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59e89e9c-6d73-42f9-98e0-674beb4ffab6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-hdzsw\" (UID: \"59e89e9c-6d73-42f9-98e0-674beb4ffab6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hdzsw" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.585722 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d7e53738-a7cd-451c-983e-6bc96fabaa27-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-szd4f\" (UID: \"d7e53738-a7cd-451c-983e-6bc96fabaa27\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-szd4f" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.585738 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d6c9aa1-aeff-45f6-be3a-beabcdf7faf0-trusted-ca\") pod 
\"ingress-operator-5b745b69d9-7f96s\" (UID: \"8d6c9aa1-aeff-45f6-be3a-beabcdf7faf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7f96s" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.585755 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9nq4\" (UniqueName: \"kubernetes.io/projected/287784c8-36de-41b8-8203-1923a9e1624b-kube-api-access-q9nq4\") pod \"machine-config-operator-74547568cd-qd69b\" (UID: \"287784c8-36de-41b8-8203-1923a9e1624b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qd69b" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.585770 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/761d3c58-fc9e-4428-b633-89b435413025-mountpoint-dir\") pod \"csi-hostpathplugin-9mvwv\" (UID: \"761d3c58-fc9e-4428-b633-89b435413025\") " pod="hostpath-provisioner/csi-hostpathplugin-9mvwv" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.585785 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4f2b2bcc-8fa7-4d23-8577-4e09d5485874-srv-cert\") pod \"catalog-operator-68c6474976-9z5kl\" (UID: \"4f2b2bcc-8fa7-4d23-8577-4e09d5485874\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9z5kl" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.585837 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/58b17661-5628-4e68-aa9e-9d6e850b6dbe-registry-certificates\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.585854 4695 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1fa5c872-2ea1-43b9-b09d-d40de9e50bca-srv-cert\") pod \"olm-operator-6b444d44fb-qjmq4\" (UID: \"1fa5c872-2ea1-43b9-b09d-d40de9e50bca\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qjmq4" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.585868 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/58b17661-5628-4e68-aa9e-9d6e850b6dbe-bound-sa-token\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.585881 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/761d3c58-fc9e-4428-b633-89b435413025-socket-dir\") pod \"csi-hostpathplugin-9mvwv\" (UID: \"761d3c58-fc9e-4428-b633-89b435413025\") " pod="hostpath-provisioner/csi-hostpathplugin-9mvwv" Mar 20 10:57:09 crc kubenswrapper[4695]: E0320 10:57:09.587572 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:10.087543366 +0000 UTC m=+207.868149129 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.590423 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96n4h\" (UniqueName: \"kubernetes.io/projected/b8cd777d-2168-411d-8232-1162ad5a99d2-kube-api-access-96n4h\") pod \"packageserver-d55dfcdfc-zwjht\" (UID: \"b8cd777d-2168-411d-8232-1162ad5a99d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zwjht" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.590647 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8d6c9aa1-aeff-45f6-be3a-beabcdf7faf0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7f96s\" (UID: \"8d6c9aa1-aeff-45f6-be3a-beabcdf7faf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7f96s" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.590802 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg2j4\" (UniqueName: \"kubernetes.io/projected/8d6c9aa1-aeff-45f6-be3a-beabcdf7faf0-kube-api-access-hg2j4\") pod \"ingress-operator-5b745b69d9-7f96s\" (UID: \"8d6c9aa1-aeff-45f6-be3a-beabcdf7faf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7f96s" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.590885 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvdfd\" (UniqueName: 
\"kubernetes.io/projected/36b62628-63ee-4520-8787-ce943f478c0b-kube-api-access-dvdfd\") pod \"marketplace-operator-79b997595-w7b9p\" (UID: \"36b62628-63ee-4520-8787-ce943f478c0b\") " pod="openshift-marketplace/marketplace-operator-79b997595-w7b9p" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.591209 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/761d3c58-fc9e-4428-b633-89b435413025-csi-data-dir\") pod \"csi-hostpathplugin-9mvwv\" (UID: \"761d3c58-fc9e-4428-b633-89b435413025\") " pod="hostpath-provisioner/csi-hostpathplugin-9mvwv" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.591671 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/36b62628-63ee-4520-8787-ce943f478c0b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-w7b9p\" (UID: \"36b62628-63ee-4520-8787-ce943f478c0b\") " pod="openshift-marketplace/marketplace-operator-79b997595-w7b9p" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.591783 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl9g7\" (UniqueName: \"kubernetes.io/projected/1fa5c872-2ea1-43b9-b09d-d40de9e50bca-kube-api-access-tl9g7\") pod \"olm-operator-6b444d44fb-qjmq4\" (UID: \"1fa5c872-2ea1-43b9-b09d-d40de9e50bca\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qjmq4" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.591838 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwdg2\" (UniqueName: \"kubernetes.io/projected/761d3c58-fc9e-4428-b633-89b435413025-kube-api-access-rwdg2\") pod \"csi-hostpathplugin-9mvwv\" (UID: \"761d3c58-fc9e-4428-b633-89b435413025\") " pod="hostpath-provisioner/csi-hostpathplugin-9mvwv" Mar 20 
10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.591951 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crmpj\" (UniqueName: \"kubernetes.io/projected/6c351d34-16d2-4ca1-a9d3-e2be750e636c-kube-api-access-crmpj\") pod \"multus-admission-controller-857f4d67dd-mmw7q\" (UID: \"6c351d34-16d2-4ca1-a9d3-e2be750e636c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mmw7q" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.591977 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/95e09f08-a8b1-4873-80a7-cfeab3b3f3b6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-lkn7q\" (UID: \"95e09f08-a8b1-4873-80a7-cfeab3b3f3b6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lkn7q" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.592002 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/761d3c58-fc9e-4428-b633-89b435413025-plugins-dir\") pod \"csi-hostpathplugin-9mvwv\" (UID: \"761d3c58-fc9e-4428-b633-89b435413025\") " pod="hostpath-provisioner/csi-hostpathplugin-9mvwv" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.592045 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6c351d34-16d2-4ca1-a9d3-e2be750e636c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mmw7q\" (UID: \"6c351d34-16d2-4ca1-a9d3-e2be750e636c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mmw7q" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.592113 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/58b17661-5628-4e68-aa9e-9d6e850b6dbe-installation-pull-secrets\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.592149 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e14b497-1345-4352-9b10-10deaeb1f6ef-service-ca-bundle\") pod \"router-default-5444994796-gnknn\" (UID: \"9e14b497-1345-4352-9b10-10deaeb1f6ef\") " pod="openshift-ingress/router-default-5444994796-gnknn" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.592173 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7e53738-a7cd-451c-983e-6bc96fabaa27-proxy-tls\") pod \"machine-config-controller-84d6567774-szd4f\" (UID: \"d7e53738-a7cd-451c-983e-6bc96fabaa27\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-szd4f" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.592261 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/58b17661-5628-4e68-aa9e-9d6e850b6dbe-ca-trust-extracted\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.592331 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58b17661-5628-4e68-aa9e-9d6e850b6dbe-trusted-ca\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.593072 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6thxm\" (UniqueName: \"kubernetes.io/projected/1caa47f1-6d17-46d8-9e6d-7c7501f3621a-kube-api-access-6thxm\") pod \"migrator-59844c95c7-kz6kx\" (UID: \"1caa47f1-6d17-46d8-9e6d-7c7501f3621a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kz6kx" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.593103 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b8cd777d-2168-411d-8232-1162ad5a99d2-tmpfs\") pod \"packageserver-d55dfcdfc-zwjht\" (UID: \"b8cd777d-2168-411d-8232-1162ad5a99d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zwjht" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.593126 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45e62be2-918a-437b-9028-771a6ad9957e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5m9rp\" (UID: \"45e62be2-918a-437b-9028-771a6ad9957e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m9rp" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.593277 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz4nl\" (UniqueName: \"kubernetes.io/projected/d7e53738-a7cd-451c-983e-6bc96fabaa27-kube-api-access-wz4nl\") pod \"machine-config-controller-84d6567774-szd4f\" (UID: \"d7e53738-a7cd-451c-983e-6bc96fabaa27\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-szd4f" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.593322 4695 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8ndf\" (UniqueName: \"kubernetes.io/projected/58b17661-5628-4e68-aa9e-9d6e850b6dbe-kube-api-access-r8ndf\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.593349 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d6c9aa1-aeff-45f6-be3a-beabcdf7faf0-metrics-tls\") pod \"ingress-operator-5b745b69d9-7f96s\" (UID: \"8d6c9aa1-aeff-45f6-be3a-beabcdf7faf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7f96s" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.593374 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hftj4\" (UniqueName: \"kubernetes.io/projected/8936aa6b-6612-48bc-b512-d2c0d72b730a-kube-api-access-hftj4\") pod \"package-server-manager-789f6589d5-xc2wd\" (UID: \"8936aa6b-6612-48bc-b512-d2c0d72b730a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xc2wd" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.593763 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b8cd777d-2168-411d-8232-1162ad5a99d2-webhook-cert\") pod \"packageserver-d55dfcdfc-zwjht\" (UID: \"b8cd777d-2168-411d-8232-1162ad5a99d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zwjht" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.594365 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/761d3c58-fc9e-4428-b633-89b435413025-registration-dir\") pod \"csi-hostpathplugin-9mvwv\" (UID: 
\"761d3c58-fc9e-4428-b633-89b435413025\") " pod="hostpath-provisioner/csi-hostpathplugin-9mvwv" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.594431 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5s9t\" (UniqueName: \"kubernetes.io/projected/9e14b497-1345-4352-9b10-10deaeb1f6ef-kube-api-access-h5s9t\") pod \"router-default-5444994796-gnknn\" (UID: \"9e14b497-1345-4352-9b10-10deaeb1f6ef\") " pod="openshift-ingress/router-default-5444994796-gnknn" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.594482 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/287784c8-36de-41b8-8203-1923a9e1624b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qd69b\" (UID: \"287784c8-36de-41b8-8203-1923a9e1624b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qd69b" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.594521 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36b62628-63ee-4520-8787-ce943f478c0b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-w7b9p\" (UID: \"36b62628-63ee-4520-8787-ce943f478c0b\") " pod="openshift-marketplace/marketplace-operator-79b997595-w7b9p" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.595108 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e14b497-1345-4352-9b10-10deaeb1f6ef-metrics-certs\") pod \"router-default-5444994796-gnknn\" (UID: \"9e14b497-1345-4352-9b10-10deaeb1f6ef\") " pod="openshift-ingress/router-default-5444994796-gnknn" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.595703 4695 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2vr5\" (UniqueName: \"kubernetes.io/projected/59e89e9c-6d73-42f9-98e0-674beb4ffab6-kube-api-access-s2vr5\") pod \"kube-storage-version-migrator-operator-b67b599dd-hdzsw\" (UID: \"59e89e9c-6d73-42f9-98e0-674beb4ffab6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hdzsw" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.595786 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncpgb\" (UniqueName: \"kubernetes.io/projected/95e09f08-a8b1-4873-80a7-cfeab3b3f3b6-kube-api-access-ncpgb\") pod \"control-plane-machine-set-operator-78cbb6b69f-lkn7q\" (UID: \"95e09f08-a8b1-4873-80a7-cfeab3b3f3b6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lkn7q" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.595828 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/287784c8-36de-41b8-8203-1923a9e1624b-images\") pod \"machine-config-operator-74547568cd-qd69b\" (UID: \"287784c8-36de-41b8-8203-1923a9e1624b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qd69b" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.613004 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-djlcz" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.615801 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-x95pq"] Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.665302 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-96sfc" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.697620 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.697885 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq7kt\" (UniqueName: \"kubernetes.io/projected/4f2b2bcc-8fa7-4d23-8577-4e09d5485874-kube-api-access-lq7kt\") pod \"catalog-operator-68c6474976-9z5kl\" (UID: \"4f2b2bcc-8fa7-4d23-8577-4e09d5485874\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9z5kl" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698010 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59e89e9c-6d73-42f9-98e0-674beb4ffab6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hdzsw\" (UID: \"59e89e9c-6d73-42f9-98e0-674beb4ffab6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hdzsw" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698038 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59e89e9c-6d73-42f9-98e0-674beb4ffab6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-hdzsw\" (UID: \"59e89e9c-6d73-42f9-98e0-674beb4ffab6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hdzsw" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698064 4695 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d7e53738-a7cd-451c-983e-6bc96fabaa27-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-szd4f\" (UID: \"d7e53738-a7cd-451c-983e-6bc96fabaa27\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-szd4f" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698081 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d6c9aa1-aeff-45f6-be3a-beabcdf7faf0-trusted-ca\") pod \"ingress-operator-5b745b69d9-7f96s\" (UID: \"8d6c9aa1-aeff-45f6-be3a-beabcdf7faf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7f96s" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698114 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9nq4\" (UniqueName: \"kubernetes.io/projected/287784c8-36de-41b8-8203-1923a9e1624b-kube-api-access-q9nq4\") pod \"machine-config-operator-74547568cd-qd69b\" (UID: \"287784c8-36de-41b8-8203-1923a9e1624b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qd69b" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698135 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgj8q\" (UniqueName: \"kubernetes.io/projected/85da7c53-efc5-4a2c-9652-fb3859a8813c-kube-api-access-zgj8q\") pod \"machine-config-server-spmzr\" (UID: \"85da7c53-efc5-4a2c-9652-fb3859a8813c\") " pod="openshift-machine-config-operator/machine-config-server-spmzr" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698153 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/761d3c58-fc9e-4428-b633-89b435413025-mountpoint-dir\") pod \"csi-hostpathplugin-9mvwv\" (UID: \"761d3c58-fc9e-4428-b633-89b435413025\") " 
pod="hostpath-provisioner/csi-hostpathplugin-9mvwv" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698173 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4f2b2bcc-8fa7-4d23-8577-4e09d5485874-srv-cert\") pod \"catalog-operator-68c6474976-9z5kl\" (UID: \"4f2b2bcc-8fa7-4d23-8577-4e09d5485874\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9z5kl" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698198 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/58b17661-5628-4e68-aa9e-9d6e850b6dbe-registry-certificates\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698214 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1fa5c872-2ea1-43b9-b09d-d40de9e50bca-srv-cert\") pod \"olm-operator-6b444d44fb-qjmq4\" (UID: \"1fa5c872-2ea1-43b9-b09d-d40de9e50bca\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qjmq4" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698240 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/58b17661-5628-4e68-aa9e-9d6e850b6dbe-bound-sa-token\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698255 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/761d3c58-fc9e-4428-b633-89b435413025-socket-dir\") pod \"csi-hostpathplugin-9mvwv\" (UID: 
\"761d3c58-fc9e-4428-b633-89b435413025\") " pod="hostpath-provisioner/csi-hostpathplugin-9mvwv" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698272 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96n4h\" (UniqueName: \"kubernetes.io/projected/b8cd777d-2168-411d-8232-1162ad5a99d2-kube-api-access-96n4h\") pod \"packageserver-d55dfcdfc-zwjht\" (UID: \"b8cd777d-2168-411d-8232-1162ad5a99d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zwjht" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698298 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8d6c9aa1-aeff-45f6-be3a-beabcdf7faf0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7f96s\" (UID: \"8d6c9aa1-aeff-45f6-be3a-beabcdf7faf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7f96s" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698333 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg2j4\" (UniqueName: \"kubernetes.io/projected/8d6c9aa1-aeff-45f6-be3a-beabcdf7faf0-kube-api-access-hg2j4\") pod \"ingress-operator-5b745b69d9-7f96s\" (UID: \"8d6c9aa1-aeff-45f6-be3a-beabcdf7faf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7f96s" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698352 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/85da7c53-efc5-4a2c-9652-fb3859a8813c-node-bootstrap-token\") pod \"machine-config-server-spmzr\" (UID: \"85da7c53-efc5-4a2c-9652-fb3859a8813c\") " pod="openshift-machine-config-operator/machine-config-server-spmzr" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698371 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvdfd\" (UniqueName: 
\"kubernetes.io/projected/36b62628-63ee-4520-8787-ce943f478c0b-kube-api-access-dvdfd\") pod \"marketplace-operator-79b997595-w7b9p\" (UID: \"36b62628-63ee-4520-8787-ce943f478c0b\") " pod="openshift-marketplace/marketplace-operator-79b997595-w7b9p" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698397 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/761d3c58-fc9e-4428-b633-89b435413025-csi-data-dir\") pod \"csi-hostpathplugin-9mvwv\" (UID: \"761d3c58-fc9e-4428-b633-89b435413025\") " pod="hostpath-provisioner/csi-hostpathplugin-9mvwv" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698451 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/36b62628-63ee-4520-8787-ce943f478c0b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-w7b9p\" (UID: \"36b62628-63ee-4520-8787-ce943f478c0b\") " pod="openshift-marketplace/marketplace-operator-79b997595-w7b9p" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698488 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76jxr\" (UniqueName: \"kubernetes.io/projected/c8136dfe-4e0c-4b37-a1bd-54162cf7c83a-kube-api-access-76jxr\") pod \"dns-default-fxxgv\" (UID: \"c8136dfe-4e0c-4b37-a1bd-54162cf7c83a\") " pod="openshift-dns/dns-default-fxxgv" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698505 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl9g7\" (UniqueName: \"kubernetes.io/projected/1fa5c872-2ea1-43b9-b09d-d40de9e50bca-kube-api-access-tl9g7\") pod \"olm-operator-6b444d44fb-qjmq4\" (UID: \"1fa5c872-2ea1-43b9-b09d-d40de9e50bca\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qjmq4" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698524 4695 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwdg2\" (UniqueName: \"kubernetes.io/projected/761d3c58-fc9e-4428-b633-89b435413025-kube-api-access-rwdg2\") pod \"csi-hostpathplugin-9mvwv\" (UID: \"761d3c58-fc9e-4428-b633-89b435413025\") " pod="hostpath-provisioner/csi-hostpathplugin-9mvwv" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698559 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/95e09f08-a8b1-4873-80a7-cfeab3b3f3b6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-lkn7q\" (UID: \"95e09f08-a8b1-4873-80a7-cfeab3b3f3b6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lkn7q" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698587 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crmpj\" (UniqueName: \"kubernetes.io/projected/6c351d34-16d2-4ca1-a9d3-e2be750e636c-kube-api-access-crmpj\") pod \"multus-admission-controller-857f4d67dd-mmw7q\" (UID: \"6c351d34-16d2-4ca1-a9d3-e2be750e636c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mmw7q" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698604 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/761d3c58-fc9e-4428-b633-89b435413025-plugins-dir\") pod \"csi-hostpathplugin-9mvwv\" (UID: \"761d3c58-fc9e-4428-b633-89b435413025\") " pod="hostpath-provisioner/csi-hostpathplugin-9mvwv" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698621 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c8136dfe-4e0c-4b37-a1bd-54162cf7c83a-metrics-tls\") pod \"dns-default-fxxgv\" (UID: \"c8136dfe-4e0c-4b37-a1bd-54162cf7c83a\") 
" pod="openshift-dns/dns-default-fxxgv" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698638 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5nlj\" (UniqueName: \"kubernetes.io/projected/ef890f11-8ab4-4d75-b3a3-14d3957fe746-kube-api-access-h5nlj\") pod \"ingress-canary-dsv9m\" (UID: \"ef890f11-8ab4-4d75-b3a3-14d3957fe746\") " pod="openshift-ingress-canary/ingress-canary-dsv9m" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698655 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6c351d34-16d2-4ca1-a9d3-e2be750e636c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mmw7q\" (UID: \"6c351d34-16d2-4ca1-a9d3-e2be750e636c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mmw7q" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698684 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/58b17661-5628-4e68-aa9e-9d6e850b6dbe-installation-pull-secrets\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698711 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e14b497-1345-4352-9b10-10deaeb1f6ef-service-ca-bundle\") pod \"router-default-5444994796-gnknn\" (UID: \"9e14b497-1345-4352-9b10-10deaeb1f6ef\") " pod="openshift-ingress/router-default-5444994796-gnknn" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698768 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/58b17661-5628-4e68-aa9e-9d6e850b6dbe-ca-trust-extracted\") pod 
\"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698786 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58b17661-5628-4e68-aa9e-9d6e850b6dbe-trusted-ca\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698801 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7e53738-a7cd-451c-983e-6bc96fabaa27-proxy-tls\") pod \"machine-config-controller-84d6567774-szd4f\" (UID: \"d7e53738-a7cd-451c-983e-6bc96fabaa27\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-szd4f" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698819 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6thxm\" (UniqueName: \"kubernetes.io/projected/1caa47f1-6d17-46d8-9e6d-7c7501f3621a-kube-api-access-6thxm\") pod \"migrator-59844c95c7-kz6kx\" (UID: \"1caa47f1-6d17-46d8-9e6d-7c7501f3621a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kz6kx" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698834 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b8cd777d-2168-411d-8232-1162ad5a99d2-tmpfs\") pod \"packageserver-d55dfcdfc-zwjht\" (UID: \"b8cd777d-2168-411d-8232-1162ad5a99d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zwjht" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698858 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz4nl\" (UniqueName: 
\"kubernetes.io/projected/d7e53738-a7cd-451c-983e-6bc96fabaa27-kube-api-access-wz4nl\") pod \"machine-config-controller-84d6567774-szd4f\" (UID: \"d7e53738-a7cd-451c-983e-6bc96fabaa27\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-szd4f" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698876 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45e62be2-918a-437b-9028-771a6ad9957e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5m9rp\" (UID: \"45e62be2-918a-437b-9028-771a6ad9957e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m9rp" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698920 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d6c9aa1-aeff-45f6-be3a-beabcdf7faf0-metrics-tls\") pod \"ingress-operator-5b745b69d9-7f96s\" (UID: \"8d6c9aa1-aeff-45f6-be3a-beabcdf7faf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7f96s" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698938 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hftj4\" (UniqueName: \"kubernetes.io/projected/8936aa6b-6612-48bc-b512-d2c0d72b730a-kube-api-access-hftj4\") pod \"package-server-manager-789f6589d5-xc2wd\" (UID: \"8936aa6b-6612-48bc-b512-d2c0d72b730a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xc2wd" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.698956 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8ndf\" (UniqueName: \"kubernetes.io/projected/58b17661-5628-4e68-aa9e-9d6e850b6dbe-kube-api-access-r8ndf\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 
20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.699000 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/761d3c58-fc9e-4428-b633-89b435413025-registration-dir\") pod \"csi-hostpathplugin-9mvwv\" (UID: \"761d3c58-fc9e-4428-b633-89b435413025\") " pod="hostpath-provisioner/csi-hostpathplugin-9mvwv" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.699015 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5s9t\" (UniqueName: \"kubernetes.io/projected/9e14b497-1345-4352-9b10-10deaeb1f6ef-kube-api-access-h5s9t\") pod \"router-default-5444994796-gnknn\" (UID: \"9e14b497-1345-4352-9b10-10deaeb1f6ef\") " pod="openshift-ingress/router-default-5444994796-gnknn" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.699030 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b8cd777d-2168-411d-8232-1162ad5a99d2-webhook-cert\") pod \"packageserver-d55dfcdfc-zwjht\" (UID: \"b8cd777d-2168-411d-8232-1162ad5a99d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zwjht" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.699105 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/287784c8-36de-41b8-8203-1923a9e1624b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qd69b\" (UID: \"287784c8-36de-41b8-8203-1923a9e1624b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qd69b" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.699128 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36b62628-63ee-4520-8787-ce943f478c0b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-w7b9p\" (UID: 
\"36b62628-63ee-4520-8787-ce943f478c0b\") " pod="openshift-marketplace/marketplace-operator-79b997595-w7b9p" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.699148 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c8136dfe-4e0c-4b37-a1bd-54162cf7c83a-config-volume\") pod \"dns-default-fxxgv\" (UID: \"c8136dfe-4e0c-4b37-a1bd-54162cf7c83a\") " pod="openshift-dns/dns-default-fxxgv" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.699174 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e14b497-1345-4352-9b10-10deaeb1f6ef-metrics-certs\") pod \"router-default-5444994796-gnknn\" (UID: \"9e14b497-1345-4352-9b10-10deaeb1f6ef\") " pod="openshift-ingress/router-default-5444994796-gnknn" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.699202 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2vr5\" (UniqueName: \"kubernetes.io/projected/59e89e9c-6d73-42f9-98e0-674beb4ffab6-kube-api-access-s2vr5\") pod \"kube-storage-version-migrator-operator-b67b599dd-hdzsw\" (UID: \"59e89e9c-6d73-42f9-98e0-674beb4ffab6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hdzsw" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.699220 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncpgb\" (UniqueName: \"kubernetes.io/projected/95e09f08-a8b1-4873-80a7-cfeab3b3f3b6-kube-api-access-ncpgb\") pod \"control-plane-machine-set-operator-78cbb6b69f-lkn7q\" (UID: \"95e09f08-a8b1-4873-80a7-cfeab3b3f3b6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lkn7q" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.699237 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"images\" (UniqueName: \"kubernetes.io/configmap/287784c8-36de-41b8-8203-1923a9e1624b-images\") pod \"machine-config-operator-74547568cd-qd69b\" (UID: \"287784c8-36de-41b8-8203-1923a9e1624b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qd69b" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.699283 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/58b17661-5628-4e68-aa9e-9d6e850b6dbe-registry-tls\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.699307 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9e14b497-1345-4352-9b10-10deaeb1f6ef-default-certificate\") pod \"router-default-5444994796-gnknn\" (UID: \"9e14b497-1345-4352-9b10-10deaeb1f6ef\") " pod="openshift-ingress/router-default-5444994796-gnknn" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.699324 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45e62be2-918a-437b-9028-771a6ad9957e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5m9rp\" (UID: \"45e62be2-918a-437b-9028-771a6ad9957e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m9rp" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.699349 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/85da7c53-efc5-4a2c-9652-fb3859a8813c-certs\") pod \"machine-config-server-spmzr\" (UID: \"85da7c53-efc5-4a2c-9652-fb3859a8813c\") " pod="openshift-machine-config-operator/machine-config-server-spmzr" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 
10:57:09.699365 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1fa5c872-2ea1-43b9-b09d-d40de9e50bca-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qjmq4\" (UID: \"1fa5c872-2ea1-43b9-b09d-d40de9e50bca\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qjmq4" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.699381 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/287784c8-36de-41b8-8203-1923a9e1624b-proxy-tls\") pod \"machine-config-operator-74547568cd-qd69b\" (UID: \"287784c8-36de-41b8-8203-1923a9e1624b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qd69b" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.699406 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef890f11-8ab4-4d75-b3a3-14d3957fe746-cert\") pod \"ingress-canary-dsv9m\" (UID: \"ef890f11-8ab4-4d75-b3a3-14d3957fe746\") " pod="openshift-ingress-canary/ingress-canary-dsv9m" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.699431 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8936aa6b-6612-48bc-b512-d2c0d72b730a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xc2wd\" (UID: \"8936aa6b-6612-48bc-b512-d2c0d72b730a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xc2wd" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.699449 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b8cd777d-2168-411d-8232-1162ad5a99d2-apiservice-cert\") pod \"packageserver-d55dfcdfc-zwjht\" (UID: 
\"b8cd777d-2168-411d-8232-1162ad5a99d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zwjht" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.699481 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4f2b2bcc-8fa7-4d23-8577-4e09d5485874-profile-collector-cert\") pod \"catalog-operator-68c6474976-9z5kl\" (UID: \"4f2b2bcc-8fa7-4d23-8577-4e09d5485874\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9z5kl" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.699514 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9e14b497-1345-4352-9b10-10deaeb1f6ef-stats-auth\") pod \"router-default-5444994796-gnknn\" (UID: \"9e14b497-1345-4352-9b10-10deaeb1f6ef\") " pod="openshift-ingress/router-default-5444994796-gnknn" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.699531 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45e62be2-918a-437b-9028-771a6ad9957e-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5m9rp\" (UID: \"45e62be2-918a-437b-9028-771a6ad9957e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m9rp" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.699558 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlw29\" (UniqueName: \"kubernetes.io/projected/b1ec0c6c-21bf-4b5f-b973-fd68c0d1c0f5-kube-api-access-wlw29\") pod \"auto-csr-approver-29566736-t45k7\" (UID: \"b1ec0c6c-21bf-4b5f-b973-fd68c0d1c0f5\") " pod="openshift-infra/auto-csr-approver-29566736-t45k7" Mar 20 10:57:09 crc kubenswrapper[4695]: E0320 10:57:09.699774 4695 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:10.199758467 +0000 UTC m=+207.980364030 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.701696 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59e89e9c-6d73-42f9-98e0-674beb4ffab6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-hdzsw\" (UID: \"59e89e9c-6d73-42f9-98e0-674beb4ffab6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hdzsw" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.704420 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/761d3c58-fc9e-4428-b633-89b435413025-csi-data-dir\") pod \"csi-hostpathplugin-9mvwv\" (UID: \"761d3c58-fc9e-4428-b633-89b435413025\") " pod="hostpath-provisioner/csi-hostpathplugin-9mvwv" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.706198 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/58b17661-5628-4e68-aa9e-9d6e850b6dbe-installation-pull-secrets\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 
10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.710048 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59e89e9c-6d73-42f9-98e0-674beb4ffab6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-hdzsw\" (UID: \"59e89e9c-6d73-42f9-98e0-674beb4ffab6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hdzsw" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.711589 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/58b17661-5628-4e68-aa9e-9d6e850b6dbe-registry-certificates\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.712456 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zhb2h"] Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.712889 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/761d3c58-fc9e-4428-b633-89b435413025-socket-dir\") pod \"csi-hostpathplugin-9mvwv\" (UID: \"761d3c58-fc9e-4428-b633-89b435413025\") " pod="hostpath-provisioner/csi-hostpathplugin-9mvwv" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.713764 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d7e53738-a7cd-451c-983e-6bc96fabaa27-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-szd4f\" (UID: \"d7e53738-a7cd-451c-983e-6bc96fabaa27\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-szd4f" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.713926 4695 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/761d3c58-fc9e-4428-b633-89b435413025-mountpoint-dir\") pod \"csi-hostpathplugin-9mvwv\" (UID: \"761d3c58-fc9e-4428-b633-89b435413025\") " pod="hostpath-provisioner/csi-hostpathplugin-9mvwv" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.714230 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/761d3c58-fc9e-4428-b633-89b435413025-plugins-dir\") pod \"csi-hostpathplugin-9mvwv\" (UID: \"761d3c58-fc9e-4428-b633-89b435413025\") " pod="hostpath-provisioner/csi-hostpathplugin-9mvwv" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.716251 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d6c9aa1-aeff-45f6-be3a-beabcdf7faf0-trusted-ca\") pod \"ingress-operator-5b745b69d9-7f96s\" (UID: \"8d6c9aa1-aeff-45f6-be3a-beabcdf7faf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7f96s" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.717665 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58b17661-5628-4e68-aa9e-9d6e850b6dbe-trusted-ca\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.741979 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/95e09f08-a8b1-4873-80a7-cfeab3b3f3b6-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-lkn7q\" (UID: \"95e09f08-a8b1-4873-80a7-cfeab3b3f3b6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lkn7q" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 
10:57:09.742661 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/58b17661-5628-4e68-aa9e-9d6e850b6dbe-ca-trust-extracted\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.743250 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/287784c8-36de-41b8-8203-1923a9e1624b-proxy-tls\") pod \"machine-config-operator-74547568cd-qd69b\" (UID: \"287784c8-36de-41b8-8203-1923a9e1624b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qd69b" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.746615 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/36b62628-63ee-4520-8787-ce943f478c0b-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-w7b9p\" (UID: \"36b62628-63ee-4520-8787-ce943f478c0b\") " pod="openshift-marketplace/marketplace-operator-79b997595-w7b9p" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.748364 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4f2b2bcc-8fa7-4d23-8577-4e09d5485874-srv-cert\") pod \"catalog-operator-68c6474976-9z5kl\" (UID: \"4f2b2bcc-8fa7-4d23-8577-4e09d5485874\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9z5kl" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.749392 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6c351d34-16d2-4ca1-a9d3-e2be750e636c-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-mmw7q\" (UID: \"6c351d34-16d2-4ca1-a9d3-e2be750e636c\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-mmw7q" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.751553 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1fa5c872-2ea1-43b9-b09d-d40de9e50bca-srv-cert\") pod \"olm-operator-6b444d44fb-qjmq4\" (UID: \"1fa5c872-2ea1-43b9-b09d-d40de9e50bca\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qjmq4" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.752415 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d7e53738-a7cd-451c-983e-6bc96fabaa27-proxy-tls\") pod \"machine-config-controller-84d6567774-szd4f\" (UID: \"d7e53738-a7cd-451c-983e-6bc96fabaa27\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-szd4f" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.753697 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/287784c8-36de-41b8-8203-1923a9e1624b-images\") pod \"machine-config-operator-74547568cd-qd69b\" (UID: \"287784c8-36de-41b8-8203-1923a9e1624b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qd69b" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.754616 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36b62628-63ee-4520-8787-ce943f478c0b-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-w7b9p\" (UID: \"36b62628-63ee-4520-8787-ce943f478c0b\") " pod="openshift-marketplace/marketplace-operator-79b997595-w7b9p" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.763839 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9e14b497-1345-4352-9b10-10deaeb1f6ef-service-ca-bundle\") pod 
\"router-default-5444994796-gnknn\" (UID: \"9e14b497-1345-4352-9b10-10deaeb1f6ef\") " pod="openshift-ingress/router-default-5444994796-gnknn" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.764154 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b8cd777d-2168-411d-8232-1162ad5a99d2-tmpfs\") pod \"packageserver-d55dfcdfc-zwjht\" (UID: \"b8cd777d-2168-411d-8232-1162ad5a99d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zwjht" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.766256 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9e14b497-1345-4352-9b10-10deaeb1f6ef-default-certificate\") pod \"router-default-5444994796-gnknn\" (UID: \"9e14b497-1345-4352-9b10-10deaeb1f6ef\") " pod="openshift-ingress/router-default-5444994796-gnknn" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.778182 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/761d3c58-fc9e-4428-b633-89b435413025-registration-dir\") pod \"csi-hostpathplugin-9mvwv\" (UID: \"761d3c58-fc9e-4428-b633-89b435413025\") " pod="hostpath-provisioner/csi-hostpathplugin-9mvwv" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.779089 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/287784c8-36de-41b8-8203-1923a9e1624b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-qd69b\" (UID: \"287784c8-36de-41b8-8203-1923a9e1624b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qd69b" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.783514 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/4f2b2bcc-8fa7-4d23-8577-4e09d5485874-profile-collector-cert\") pod \"catalog-operator-68c6474976-9z5kl\" (UID: \"4f2b2bcc-8fa7-4d23-8577-4e09d5485874\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9z5kl" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.785652 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45e62be2-918a-437b-9028-771a6ad9957e-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5m9rp\" (UID: \"45e62be2-918a-437b-9028-771a6ad9957e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m9rp" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.796047 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8936aa6b-6612-48bc-b512-d2c0d72b730a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xc2wd\" (UID: \"8936aa6b-6612-48bc-b512-d2c0d72b730a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xc2wd" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.796716 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9e14b497-1345-4352-9b10-10deaeb1f6ef-metrics-certs\") pod \"router-default-5444994796-gnknn\" (UID: \"9e14b497-1345-4352-9b10-10deaeb1f6ef\") " pod="openshift-ingress/router-default-5444994796-gnknn" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.796757 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b8cd777d-2168-411d-8232-1162ad5a99d2-apiservice-cert\") pod \"packageserver-d55dfcdfc-zwjht\" (UID: \"b8cd777d-2168-411d-8232-1162ad5a99d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zwjht" Mar 20 10:57:09 crc kubenswrapper[4695]: 
I0320 10:57:09.798965 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-92bhx"] Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.802984 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c8136dfe-4e0c-4b37-a1bd-54162cf7c83a-metrics-tls\") pod \"dns-default-fxxgv\" (UID: \"c8136dfe-4e0c-4b37-a1bd-54162cf7c83a\") " pod="openshift-dns/dns-default-fxxgv" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.803027 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5nlj\" (UniqueName: \"kubernetes.io/projected/ef890f11-8ab4-4d75-b3a3-14d3957fe746-kube-api-access-h5nlj\") pod \"ingress-canary-dsv9m\" (UID: \"ef890f11-8ab4-4d75-b3a3-14d3957fe746\") " pod="openshift-ingress-canary/ingress-canary-dsv9m" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.803151 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c8136dfe-4e0c-4b37-a1bd-54162cf7c83a-config-volume\") pod \"dns-default-fxxgv\" (UID: \"c8136dfe-4e0c-4b37-a1bd-54162cf7c83a\") " pod="openshift-dns/dns-default-fxxgv" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.803220 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.803292 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/85da7c53-efc5-4a2c-9652-fb3859a8813c-certs\") pod 
\"machine-config-server-spmzr\" (UID: \"85da7c53-efc5-4a2c-9652-fb3859a8813c\") " pod="openshift-machine-config-operator/machine-config-server-spmzr" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.803346 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef890f11-8ab4-4d75-b3a3-14d3957fe746-cert\") pod \"ingress-canary-dsv9m\" (UID: \"ef890f11-8ab4-4d75-b3a3-14d3957fe746\") " pod="openshift-ingress-canary/ingress-canary-dsv9m" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.803477 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgj8q\" (UniqueName: \"kubernetes.io/projected/85da7c53-efc5-4a2c-9652-fb3859a8813c-kube-api-access-zgj8q\") pod \"machine-config-server-spmzr\" (UID: \"85da7c53-efc5-4a2c-9652-fb3859a8813c\") " pod="openshift-machine-config-operator/machine-config-server-spmzr" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.803554 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/85da7c53-efc5-4a2c-9652-fb3859a8813c-node-bootstrap-token\") pod \"machine-config-server-spmzr\" (UID: \"85da7c53-efc5-4a2c-9652-fb3859a8813c\") " pod="openshift-machine-config-operator/machine-config-server-spmzr" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.803607 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76jxr\" (UniqueName: \"kubernetes.io/projected/c8136dfe-4e0c-4b37-a1bd-54162cf7c83a-kube-api-access-76jxr\") pod \"dns-default-fxxgv\" (UID: \"c8136dfe-4e0c-4b37-a1bd-54162cf7c83a\") " pod="openshift-dns/dns-default-fxxgv" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.804060 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs"] Mar 20 10:57:09 crc kubenswrapper[4695]: E0320 
10:57:09.804503 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:10.304480064 +0000 UTC m=+208.085085627 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.806520 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c8136dfe-4e0c-4b37-a1bd-54162cf7c83a-config-volume\") pod \"dns-default-fxxgv\" (UID: \"c8136dfe-4e0c-4b37-a1bd-54162cf7c83a\") " pod="openshift-dns/dns-default-fxxgv" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.814065 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1fa5c872-2ea1-43b9-b09d-d40de9e50bca-profile-collector-cert\") pod \"olm-operator-6b444d44fb-qjmq4\" (UID: \"1fa5c872-2ea1-43b9-b09d-d40de9e50bca\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qjmq4" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.814257 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/85da7c53-efc5-4a2c-9652-fb3859a8813c-certs\") pod \"machine-config-server-spmzr\" (UID: \"85da7c53-efc5-4a2c-9652-fb3859a8813c\") " pod="openshift-machine-config-operator/machine-config-server-spmzr" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.814294 
4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b8cd777d-2168-411d-8232-1162ad5a99d2-webhook-cert\") pod \"packageserver-d55dfcdfc-zwjht\" (UID: \"b8cd777d-2168-411d-8232-1162ad5a99d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zwjht" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.822680 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c8136dfe-4e0c-4b37-a1bd-54162cf7c83a-metrics-tls\") pod \"dns-default-fxxgv\" (UID: \"c8136dfe-4e0c-4b37-a1bd-54162cf7c83a\") " pod="openshift-dns/dns-default-fxxgv" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.823863 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/85da7c53-efc5-4a2c-9652-fb3859a8813c-node-bootstrap-token\") pod \"machine-config-server-spmzr\" (UID: \"85da7c53-efc5-4a2c-9652-fb3859a8813c\") " pod="openshift-machine-config-operator/machine-config-server-spmzr" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.823938 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq7kt\" (UniqueName: \"kubernetes.io/projected/4f2b2bcc-8fa7-4d23-8577-4e09d5485874-kube-api-access-lq7kt\") pod \"catalog-operator-68c6474976-9z5kl\" (UID: \"4f2b2bcc-8fa7-4d23-8577-4e09d5485874\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9z5kl" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.824553 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg2j4\" (UniqueName: \"kubernetes.io/projected/8d6c9aa1-aeff-45f6-be3a-beabcdf7faf0-kube-api-access-hg2j4\") pod \"ingress-operator-5b745b69d9-7f96s\" (UID: \"8d6c9aa1-aeff-45f6-be3a-beabcdf7faf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7f96s" Mar 20 10:57:09 crc 
kubenswrapper[4695]: I0320 10:57:09.824832 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45e62be2-918a-437b-9028-771a6ad9957e-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5m9rp\" (UID: \"45e62be2-918a-437b-9028-771a6ad9957e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m9rp" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.825156 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ef890f11-8ab4-4d75-b3a3-14d3957fe746-cert\") pod \"ingress-canary-dsv9m\" (UID: \"ef890f11-8ab4-4d75-b3a3-14d3957fe746\") " pod="openshift-ingress-canary/ingress-canary-dsv9m" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.825492 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/58b17661-5628-4e68-aa9e-9d6e850b6dbe-registry-tls\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.826147 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9e14b497-1345-4352-9b10-10deaeb1f6ef-stats-auth\") pod \"router-default-5444994796-gnknn\" (UID: \"9e14b497-1345-4352-9b10-10deaeb1f6ef\") " pod="openshift-ingress/router-default-5444994796-gnknn" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.826337 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d6c9aa1-aeff-45f6-be3a-beabcdf7faf0-metrics-tls\") pod \"ingress-operator-5b745b69d9-7f96s\" (UID: \"8d6c9aa1-aeff-45f6-be3a-beabcdf7faf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7f96s" Mar 20 10:57:09 crc 
kubenswrapper[4695]: I0320 10:57:09.826743 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvdfd\" (UniqueName: \"kubernetes.io/projected/36b62628-63ee-4520-8787-ce943f478c0b-kube-api-access-dvdfd\") pod \"marketplace-operator-79b997595-w7b9p\" (UID: \"36b62628-63ee-4520-8787-ce943f478c0b\") " pod="openshift-marketplace/marketplace-operator-79b997595-w7b9p" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.827271 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9z5kl" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.830980 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlw29\" (UniqueName: \"kubernetes.io/projected/b1ec0c6c-21bf-4b5f-b973-fd68c0d1c0f5-kube-api-access-wlw29\") pod \"auto-csr-approver-29566736-t45k7\" (UID: \"b1ec0c6c-21bf-4b5f-b973-fd68c0d1c0f5\") " pod="openshift-infra/auto-csr-approver-29566736-t45k7" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.843579 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8d6c9aa1-aeff-45f6-be3a-beabcdf7faf0-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7f96s\" (UID: \"8d6c9aa1-aeff-45f6-be3a-beabcdf7faf0\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7f96s" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.859453 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/58b17661-5628-4e68-aa9e-9d6e850b6dbe-bound-sa-token\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.877070 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl9g7\" 
(UniqueName: \"kubernetes.io/projected/1fa5c872-2ea1-43b9-b09d-d40de9e50bca-kube-api-access-tl9g7\") pod \"olm-operator-6b444d44fb-qjmq4\" (UID: \"1fa5c872-2ea1-43b9-b09d-d40de9e50bca\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qjmq4" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.879307 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-w7b9p" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.905181 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:09 crc kubenswrapper[4695]: E0320 10:57:09.905389 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:10.405341863 +0000 UTC m=+208.185947426 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.905466 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:09 crc kubenswrapper[4695]: E0320 10:57:09.906591 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:10.406581365 +0000 UTC m=+208.187186928 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.911274 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwdg2\" (UniqueName: \"kubernetes.io/projected/761d3c58-fc9e-4428-b633-89b435413025-kube-api-access-rwdg2\") pod \"csi-hostpathplugin-9mvwv\" (UID: \"761d3c58-fc9e-4428-b633-89b435413025\") " pod="hostpath-provisioner/csi-hostpathplugin-9mvwv" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.928268 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qjmq4" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.931223 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9nq4\" (UniqueName: \"kubernetes.io/projected/287784c8-36de-41b8-8203-1923a9e1624b-kube-api-access-q9nq4\") pod \"machine-config-operator-74547568cd-qd69b\" (UID: \"287784c8-36de-41b8-8203-1923a9e1624b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qd69b" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.935048 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566736-t45k7" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.937886 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crmpj\" (UniqueName: \"kubernetes.io/projected/6c351d34-16d2-4ca1-a9d3-e2be750e636c-kube-api-access-crmpj\") pod \"multus-admission-controller-857f4d67dd-mmw7q\" (UID: \"6c351d34-16d2-4ca1-a9d3-e2be750e636c\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-mmw7q" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.953451 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qd69b" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.958630 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2vr5\" (UniqueName: \"kubernetes.io/projected/59e89e9c-6d73-42f9-98e0-674beb4ffab6-kube-api-access-s2vr5\") pod \"kube-storage-version-migrator-operator-b67b599dd-hdzsw\" (UID: \"59e89e9c-6d73-42f9-98e0-674beb4ffab6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hdzsw" Mar 20 10:57:09 crc kubenswrapper[4695]: I0320 10:57:09.975297 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96n4h\" (UniqueName: \"kubernetes.io/projected/b8cd777d-2168-411d-8232-1162ad5a99d2-kube-api-access-96n4h\") pod \"packageserver-d55dfcdfc-zwjht\" (UID: \"b8cd777d-2168-411d-8232-1162ad5a99d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zwjht" Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.006028 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz4nl\" (UniqueName: \"kubernetes.io/projected/d7e53738-a7cd-451c-983e-6bc96fabaa27-kube-api-access-wz4nl\") pod \"machine-config-controller-84d6567774-szd4f\" (UID: 
\"d7e53738-a7cd-451c-983e-6bc96fabaa27\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-szd4f" Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.007115 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.007392 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs" event={"ID":"c569b375-f808-4f3d-8f3d-a162677356ff","Type":"ContainerStarted","Data":"cb5f6c9db323f50f56be48363d3d90938be7ecfb9332a07929bfbecf8ad92ecd"} Mar 20 10:57:10 crc kubenswrapper[4695]: E0320 10:57:10.007691 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:10.507657889 +0000 UTC m=+208.288263452 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.024837 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-x95pq" event={"ID":"4290a5ac-ad3d-4b7c-aa6a-8d714de728de","Type":"ContainerStarted","Data":"108b45c7ab2ce5cfc14f319c11ef6d8f7e14e6acc62e6bd418f991d6901eac5c"} Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.024887 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-x95pq" event={"ID":"4290a5ac-ad3d-4b7c-aa6a-8d714de728de","Type":"ContainerStarted","Data":"c6e98f69431f743524ac9c2c01ca3c4d18558ba3dcbfc243ed0a11f0184ca5fb"} Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.027786 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-x95pq" Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.028675 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s62xw" event={"ID":"32c4a374-18fd-4aac-807a-f191398b9490","Type":"ContainerStarted","Data":"e1e25cd26ab296902c56f8b76b497685e9006f7b94e9864060543caeecf8d887"} Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.028787 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s62xw" 
event={"ID":"32c4a374-18fd-4aac-807a-f191398b9490","Type":"ContainerStarted","Data":"35b38186b516b5823260247e9043b16fe84cef1bbcb398fe041a89565295fd9a"} Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.031521 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zhb2h" event={"ID":"31256627-4fcd-49a5-87b1-0e52e6265720","Type":"ContainerStarted","Data":"55e065d3afaba4b0787f4971585d408e761a772c1c5c0253967d04ec75dee6a8"} Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.033939 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-szd4f" Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.035436 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncpgb\" (UniqueName: \"kubernetes.io/projected/95e09f08-a8b1-4873-80a7-cfeab3b3f3b6-kube-api-access-ncpgb\") pod \"control-plane-machine-set-operator-78cbb6b69f-lkn7q\" (UID: \"95e09f08-a8b1-4873-80a7-cfeab3b3f3b6\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lkn7q" Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.041750 4695 patch_prober.go:28] interesting pod/console-operator-58897d9998-x95pq container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/readyz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.041822 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-x95pq" podUID="4290a5ac-ad3d-4b7c-aa6a-8d714de728de" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/readyz\": dial tcp 10.217.0.33:8443: connect: connection refused" Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.055169 4695 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6thxm\" (UniqueName: \"kubernetes.io/projected/1caa47f1-6d17-46d8-9e6d-7c7501f3621a-kube-api-access-6thxm\") pod \"migrator-59844c95c7-kz6kx\" (UID: \"1caa47f1-6d17-46d8-9e6d-7c7501f3621a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kz6kx" Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.055524 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ghv5z" event={"ID":"5d71b917-7595-49dd-8c41-e7e642210d35","Type":"ContainerStarted","Data":"817fa3547618164588dc2b6871937303547b9115e5b0492bccd7e5311e7f8fc1"} Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.055579 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ghv5z" event={"ID":"5d71b917-7595-49dd-8c41-e7e642210d35","Type":"ContainerStarted","Data":"a2ad937422a8b78af06df0df645df5f2ea3d63f08449193783cc09feddd22a7d"} Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.066145 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zlr2v" event={"ID":"c281c741-25bd-4650-a6aa-6e33eaf3d80c","Type":"ContainerStarted","Data":"72285143ecafd66f49a2537b53dceefd917602f6c0ec484aa223287ba9f51df6"} Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.066206 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zlr2v" event={"ID":"c281c741-25bd-4650-a6aa-6e33eaf3d80c","Type":"ContainerStarted","Data":"33da2607dbafa5b8059f7b19ca216cbc1864ae240964a3ca9098a1b67e56bf66"} Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.066880 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/45e62be2-918a-437b-9028-771a6ad9957e-kube-api-access\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-5m9rp\" (UID: \"45e62be2-918a-437b-9028-771a6ad9957e\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m9rp" Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.069951 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8f97r"] Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.075617 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-rgpbm"] Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.076719 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kz6kx" Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.080507 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8ndf\" (UniqueName: \"kubernetes.io/projected/58b17661-5628-4e68-aa9e-9d6e850b6dbe-kube-api-access-r8ndf\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.084773 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lkn7q" Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.093583 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hftj4\" (UniqueName: \"kubernetes.io/projected/8936aa6b-6612-48bc-b512-d2c0d72b730a-kube-api-access-hftj4\") pod \"package-server-manager-789f6589d5-xc2wd\" (UID: \"8936aa6b-6612-48bc-b512-d2c0d72b730a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xc2wd" Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.097527 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hdzsw" Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.109060 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:10 crc kubenswrapper[4695]: E0320 10:57:10.110401 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:10.610351234 +0000 UTC m=+208.390956797 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.112103 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7f96s" Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.118843 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-mmw7q" Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.142804 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5s9t\" (UniqueName: \"kubernetes.io/projected/9e14b497-1345-4352-9b10-10deaeb1f6ef-kube-api-access-h5s9t\") pod \"router-default-5444994796-gnknn\" (UID: \"9e14b497-1345-4352-9b10-10deaeb1f6ef\") " pod="openshift-ingress/router-default-5444994796-gnknn" Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.147295 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xc2wd" Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.163435 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5nlj\" (UniqueName: \"kubernetes.io/projected/ef890f11-8ab4-4d75-b3a3-14d3957fe746-kube-api-access-h5nlj\") pod \"ingress-canary-dsv9m\" (UID: \"ef890f11-8ab4-4d75-b3a3-14d3957fe746\") " pod="openshift-ingress-canary/ingress-canary-dsv9m" Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.172363 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-9mvwv" Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.181094 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76jxr\" (UniqueName: \"kubernetes.io/projected/c8136dfe-4e0c-4b37-a1bd-54162cf7c83a-kube-api-access-76jxr\") pod \"dns-default-fxxgv\" (UID: \"c8136dfe-4e0c-4b37-a1bd-54162cf7c83a\") " pod="openshift-dns/dns-default-fxxgv" Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.189083 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zwjht" Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.213501 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:10 crc kubenswrapper[4695]: E0320 10:57:10.213872 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:10.713851181 +0000 UTC m=+208.494456744 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.225816 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgj8q\" (UniqueName: \"kubernetes.io/projected/85da7c53-efc5-4a2c-9652-fb3859a8813c-kube-api-access-zgj8q\") pod \"machine-config-server-spmzr\" (UID: \"85da7c53-efc5-4a2c-9652-fb3859a8813c\") " pod="openshift-machine-config-operator/machine-config-server-spmzr" Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.272871 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-fxxgv" Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.281144 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-spmzr" Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.287706 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dsv9m" Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.315390 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:10 crc kubenswrapper[4695]: E0320 10:57:10.316998 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:10.816976288 +0000 UTC m=+208.597581851 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.368607 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-gnknn" Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.370589 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m9rp" Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.411205 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-dzwsk"] Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.418898 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:10 crc kubenswrapper[4695]: E0320 10:57:10.419411 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:10.919369156 +0000 UTC m=+208.699974719 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.427859 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:10 crc kubenswrapper[4695]: E0320 10:57:10.428759 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:10.928739347 +0000 UTC m=+208.709344910 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.530025 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:10 crc kubenswrapper[4695]: E0320 10:57:10.530629 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:11.030604211 +0000 UTC m=+208.811209774 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.638902 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:10 crc kubenswrapper[4695]: E0320 10:57:10.640396 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:11.140376799 +0000 UTC m=+208.920982362 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:10 crc kubenswrapper[4695]: W0320 10:57:10.680798 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacc0b6e9_5cb1_4bf5_b66d_abb48a7a1564.slice/crio-a23a8e09db181a21e8d9d1de3d8e2b826a3e6f87a4d608683c7c293752197269 WatchSource:0}: Error finding container a23a8e09db181a21e8d9d1de3d8e2b826a3e6f87a4d608683c7c293752197269: Status 404 returned error can't find the container with id a23a8e09db181a21e8d9d1de3d8e2b826a3e6f87a4d608683c7c293752197269 Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.751586 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:10 crc kubenswrapper[4695]: E0320 10:57:10.752067 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:11.252049917 +0000 UTC m=+209.032655480 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.855666 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:10 crc kubenswrapper[4695]: E0320 10:57:10.856589 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:11.356545329 +0000 UTC m=+209.137150892 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.899361 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-s62xw" podStartSLOduration=164.899342911 podStartE2EDuration="2m44.899342911s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:10.897314079 +0000 UTC m=+208.677919642" watchObservedRunningTime="2026-03-20 10:57:10.899342911 +0000 UTC m=+208.679948474" Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.957103 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:10 crc kubenswrapper[4695]: E0320 10:57:10.957716 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:11.457679754 +0000 UTC m=+209.238285317 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:10 crc kubenswrapper[4695]: I0320 10:57:10.957857 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:10 crc kubenswrapper[4695]: E0320 10:57:10.958308 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:11.45829707 +0000 UTC m=+209.238902623 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.060400 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:11 crc kubenswrapper[4695]: E0320 10:57:11.061171 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:11.56114295 +0000 UTC m=+209.341748523 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.085137 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zhb2h" event={"ID":"31256627-4fcd-49a5-87b1-0e52e6265720","Type":"ContainerStarted","Data":"c99caca02f633bd4748506952f64cc0e9aa23a525866078fab00a086469f75d0"} Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.102168 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-spmzr" event={"ID":"85da7c53-efc5-4a2c-9652-fb3859a8813c","Type":"ContainerStarted","Data":"1d5e16dfff8da8a74bcd356cb8417d5745281204f387c2ea4a34da8751186f98"} Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.102237 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-spmzr" event={"ID":"85da7c53-efc5-4a2c-9652-fb3859a8813c","Type":"ContainerStarted","Data":"1a7ddfc6a48fcd88696403e243b47b0333706000178c5146bc138e824e570e6f"} Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.104052 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dzwsk" event={"ID":"acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564","Type":"ContainerStarted","Data":"a23a8e09db181a21e8d9d1de3d8e2b826a3e6f87a4d608683c7c293752197269"} Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.108443 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-x95pq" 
podStartSLOduration=165.108412938 podStartE2EDuration="2m45.108412938s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:11.078161638 +0000 UTC m=+208.858767211" watchObservedRunningTime="2026-03-20 10:57:11.108412938 +0000 UTC m=+208.889018501" Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.109483 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-92bhx" event={"ID":"0cd0f715-9ef8-4e6c-8e56-8e17bd66d882","Type":"ContainerStarted","Data":"6f5ff8de84c45c932f213effb9f8c306d8d4da3984cf00a2b4a5ab00d4dbf67e"} Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.109571 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-92bhx" event={"ID":"0cd0f715-9ef8-4e6c-8e56-8e17bd66d882","Type":"ContainerStarted","Data":"a16f22acd60fe59459dc44644d4de77ca36d0ca16ab772105967b856bf2e1043"} Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.109584 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-92bhx" event={"ID":"0cd0f715-9ef8-4e6c-8e56-8e17bd66d882","Type":"ContainerStarted","Data":"d49dad20d20334528b4d33358ac082d72ad73a55e6370b6f303c1e146d742cba"} Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.118236 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8f97r" event={"ID":"0cb76749-e1ee-42c1-bcef-cb2ca1d793d2","Type":"ContainerStarted","Data":"f3d6e2ece161fe685ec30dca77bc47a0c75a671f8d315d13c544b45bd3f27b11"} Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.118307 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8f97r" 
event={"ID":"0cb76749-e1ee-42c1-bcef-cb2ca1d793d2","Type":"ContainerStarted","Data":"fd45267ba657b2b2eb31731b32b8311c757399f57d2bab11928b1ea0a9a03cbb"} Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.118798 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-8f97r" Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.120543 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs" event={"ID":"c569b375-f808-4f3d-8f3d-a162677356ff","Type":"ContainerStarted","Data":"a247a8936badba2d41e89bf70a0fb81e27b8913627d323fbad080ef85e800894"} Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.121046 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs" Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.124289 4695 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-8f97r container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.124343 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-8f97r" podUID="0cb76749-e1ee-42c1-bcef-cb2ca1d793d2" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.126227 4695 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-8ttcs container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 
10.217.0.5:8443: connect: connection refused" start-of-body= Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.126292 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs" podUID="c569b375-f808-4f3d-8f3d-a162677356ff" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.127001 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zlr2v" event={"ID":"c281c741-25bd-4650-a6aa-6e33eaf3d80c","Type":"ContainerStarted","Data":"0979655d8cfdb60b93b0eda3a29ee0a40d568f49497f4d585edceeab735f2d4d"} Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.129232 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-gnknn" event={"ID":"9e14b497-1345-4352-9b10-10deaeb1f6ef","Type":"ContainerStarted","Data":"d70828fed37ecce3ba51b53a790c9d628f48069b913edb2819ae0a0bc615e521"} Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.137641 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-ghv5z" event={"ID":"5d71b917-7595-49dd-8c41-e7e642210d35","Type":"ContainerStarted","Data":"74151d0ca6cd0e39bd94a210ecf795f09018b68cb5ecb0bdaf0c43e8ca474872"} Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.147303 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-rgpbm" event={"ID":"80f6a0e3-51c3-408f-93ca-c9bf5ed57f34","Type":"ContainerStarted","Data":"8ed560dfb3706feb2d8d8d65a352d9dda0bd8c2c7810699ceb6f7187742eff00"} Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.147375 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication-operator/authentication-operator-69f744f599-rgpbm" event={"ID":"80f6a0e3-51c3-408f-93ca-c9bf5ed57f34","Type":"ContainerStarted","Data":"813cc22ecf36cc464627bce2212950bb64c93d99562c7689ae755ef49767db3a"} Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.151303 4695 patch_prober.go:28] interesting pod/console-operator-58897d9998-x95pq container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/readyz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.151380 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-x95pq" podUID="4290a5ac-ad3d-4b7c-aa6a-8d714de728de" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/readyz\": dial tcp 10.217.0.33:8443: connect: connection refused" Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.162498 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:11 crc kubenswrapper[4695]: E0320 10:57:11.164017 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:11.66399897 +0000 UTC m=+209.444604533 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.264134 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:11 crc kubenswrapper[4695]: E0320 10:57:11.264294 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:11.764258673 +0000 UTC m=+209.544864236 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.265954 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:11 crc kubenswrapper[4695]: E0320 10:57:11.267375 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:11.767119796 +0000 UTC m=+209.547725539 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.369883 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:11 crc kubenswrapper[4695]: E0320 10:57:11.370638 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:11.870614313 +0000 UTC m=+209.651219876 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.471607 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:11 crc kubenswrapper[4695]: E0320 10:57:11.472029 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:11.972009885 +0000 UTC m=+209.752615448 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.524823 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-dqr46"] Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.573850 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:11 crc kubenswrapper[4695]: E0320 10:57:11.574964 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:12.074934127 +0000 UTC m=+209.855539690 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.608294 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb"] Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.612308 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-ns9rz"] Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.612365 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqhhq"] Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.676353 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:11 crc kubenswrapper[4695]: E0320 10:57:11.677020 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:12.176998756 +0000 UTC m=+209.957604319 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.688730 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r82b4"] Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.708464 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-z8vn7"] Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.757940 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-djlcz"] Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.779967 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2dmjw"] Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.781626 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:11 crc kubenswrapper[4695]: E0320 10:57:11.782148 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:12.282121264 +0000 UTC m=+210.062726827 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.793318 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-92bhx" podStartSLOduration=165.793287782 podStartE2EDuration="2m45.793287782s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:11.780618846 +0000 UTC m=+209.561224409" watchObservedRunningTime="2026-03-20 10:57:11.793287782 +0000 UTC m=+209.573893355" Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.795822 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vnnmf"] Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.833880 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xz9q6"] Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.834894 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-spmzr" podStartSLOduration=4.834868813 podStartE2EDuration="4.834868813s" podCreationTimestamp="2026-03-20 10:57:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:11.830387958 +0000 UTC m=+209.610993531" 
watchObservedRunningTime="2026-03-20 10:57:11.834868813 +0000 UTC m=+209.615474376" Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.849099 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-s2xcj"] Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.859100 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-zlr2v" podStartSLOduration=165.859071927 podStartE2EDuration="2m45.859071927s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:11.85299351 +0000 UTC m=+209.633599073" watchObservedRunningTime="2026-03-20 10:57:11.859071927 +0000 UTC m=+209.639677490" Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.883970 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:11 crc kubenswrapper[4695]: E0320 10:57:11.884514 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:12.384499092 +0000 UTC m=+210.165104655 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.916863 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566725-qt6v2"] Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.932832 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-w7b9p"] Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.962189 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-8f97r" podStartSLOduration=165.962150993 podStartE2EDuration="2m45.962150993s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:11.901018898 +0000 UTC m=+209.681624461" watchObservedRunningTime="2026-03-20 10:57:11.962150993 +0000 UTC m=+209.742756576" Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.986761 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:11 crc kubenswrapper[4695]: E0320 10:57:11.987429 4695 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:12.487406363 +0000 UTC m=+210.268011926 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.989057 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-96sfc"] Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.990713 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs" podStartSLOduration=165.990222776 podStartE2EDuration="2m45.990222776s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:11.933587267 +0000 UTC m=+209.714192830" watchObservedRunningTime="2026-03-20 10:57:11.990222776 +0000 UTC m=+209.770828339" Mar 20 10:57:11 crc kubenswrapper[4695]: I0320 10:57:11.993229 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-zhb2h" podStartSLOduration=165.993213673 podStartE2EDuration="2m45.993213673s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 10:57:11.984529449 +0000 UTC m=+209.765135012" watchObservedRunningTime="2026-03-20 10:57:11.993213673 +0000 UTC m=+209.773819236" Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.004170 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-szd4f"] Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.063040 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-rgpbm" podStartSLOduration=166.063020151 podStartE2EDuration="2m46.063020151s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:12.061700087 +0000 UTC m=+209.842305650" watchObservedRunningTime="2026-03-20 10:57:12.063020151 +0000 UTC m=+209.843625714" Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.063192 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-ghv5z" podStartSLOduration=166.063186215 podStartE2EDuration="2m46.063186215s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:12.013768802 +0000 UTC m=+209.794374365" watchObservedRunningTime="2026-03-20 10:57:12.063186215 +0000 UTC m=+209.843791778" Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.082974 4695 ???:1] "http: TLS handshake error from 192.168.126.11:37892: no serving certificate available for the kubelet" Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.089313 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:12 crc kubenswrapper[4695]: E0320 10:57:12.089798 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:12.589779061 +0000 UTC m=+210.370384614 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.191537 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:12 crc kubenswrapper[4695]: E0320 10:57:12.191890 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:12.691874041 +0000 UTC m=+210.472479604 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.212841 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vnnmf" event={"ID":"178400b6-5c7d-4e98-9884-ac349ecc48e8","Type":"ContainerStarted","Data":"dde1d74f08e829cb9b74d284a4f27aa5ed94ff5c43c6249b4ecdba55e479e264"} Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.252570 4695 ???:1] "http: TLS handshake error from 192.168.126.11:37908: no serving certificate available for the kubelet" Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.256381 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" event={"ID":"42a13b86-8970-4571-9a14-9dea1c55558f","Type":"ContainerStarted","Data":"3ec7bfa5cb12669dec476bc0bd21153f11dbcc0d85f215706d9f4d36ee4ad8ef"} Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.259844 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566736-t45k7"] Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.285026 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-djlcz" event={"ID":"288d9110-491b-424c-b001-47835e23220f","Type":"ContainerStarted","Data":"68c4166f9a1e6ea28f76457a5ba24cbebfd33672b1defe1c4ac0d78cc4eead6b"} Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.295403 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:12 crc kubenswrapper[4695]: E0320 10:57:12.295882 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:12.79586535 +0000 UTC m=+210.576470903 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.333928 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lkn7q"] Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.334941 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" event={"ID":"c5f41a94-dc5d-4026-983e-52e817217252","Type":"ContainerStarted","Data":"920200843c0ace85218d42c99b2d107b41cd8f9b04aebc1e2dde089e6cbc2888"} Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.346843 4695 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.370797 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9z5kl"] Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.382678 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hdzsw"] Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.396897 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:12 crc kubenswrapper[4695]: E0320 10:57:12.397525 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:12.897500578 +0000 UTC m=+210.678106141 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.410893 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xc2wd"] Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.413375 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb" event={"ID":"f1b00244-2d56-4cfc-a852-0ef9c33214e1","Type":"ContainerStarted","Data":"eca0f9cacaa235fee98522d50fb95daf68985d0d3822f0e62c990f68a715f221"} Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.413955 4695 ???:1] "http: TLS handshake error from 192.168.126.11:37914: no serving certificate available for the kubelet" Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.433335 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-gnknn" event={"ID":"9e14b497-1345-4352-9b10-10deaeb1f6ef","Type":"ContainerStarted","Data":"49d9344ab67776129768b383ad47408fe34f5ee9bc7379d1036485d90cfbbd1f"} Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.436634 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m9rp"] Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.438261 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ns9rz" 
event={"ID":"649c40d8-a63f-4361-89b5-15e4ccc2e4dc","Type":"ContainerStarted","Data":"b4fa6f7436f0eae8ffd0480d7e1fe0327b070b9603e703dacf43fd1f42a80bf3"} Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.444014 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zwjht"] Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.446006 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fxxgv"] Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.449992 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-mmw7q"] Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.453559 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qjmq4"] Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.459672 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-9mvwv"] Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.463171 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dzwsk" event={"ID":"acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564","Type":"ContainerStarted","Data":"3d577d53339f614c718f930af9e8ca507db12b381b72ffd58f59c01bf0ccd16e"} Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.465245 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-dzwsk" Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.467711 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-kz6kx"] Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.470098 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-dqr46" 
event={"ID":"214a7b7b-3483-4084-bf89-fffbe6e5d591","Type":"ContainerStarted","Data":"9c9bd14401c5ef569696b7d8359ff191468ca8de4c5fa18b6d8bc0edb5bceebf"} Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.474398 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqhhq" event={"ID":"379303a6-d86c-4861-b7d5-b46f2a336fb9","Type":"ContainerStarted","Data":"1c5732a36e560e4552d4a5f628a55cdb797583ac7101f0b1978eb47e92475ff7"} Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.474802 4695 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-8f97r container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.474881 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-8f97r" podUID="0cb76749-e1ee-42c1-bcef-cb2ca1d793d2" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.476956 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dsv9m"] Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.483264 4695 patch_prober.go:28] interesting pod/downloads-7954f5f757-dzwsk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.484448 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-gnknn" 
podStartSLOduration=166.484404557 podStartE2EDuration="2m46.484404557s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:12.455151374 +0000 UTC m=+210.235756937" watchObservedRunningTime="2026-03-20 10:57:12.484404557 +0000 UTC m=+210.265010130" Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.484847 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dzwsk" podUID="acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.487268 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs" Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.491468 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-dzwsk" podStartSLOduration=166.491447089 podStartE2EDuration="2m46.491447089s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:12.483804962 +0000 UTC m=+210.264410525" watchObservedRunningTime="2026-03-20 10:57:12.491447089 +0000 UTC m=+210.272052652" Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.492027 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7f96s"] Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.492877 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-qd69b"] Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 
10:57:12.494483 4695 ???:1] "http: TLS handshake error from 192.168.126.11:37930: no serving certificate available for the kubelet" Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.498709 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:12 crc kubenswrapper[4695]: E0320 10:57:12.500764 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:13.000740148 +0000 UTC m=+210.781345711 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:12 crc kubenswrapper[4695]: W0320 10:57:12.583106 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f2b2bcc_8fa7_4d23_8577_4e09d5485874.slice/crio-5ada441e933e05d7acb7b6fe53cf1632c6a20890698b0c7b36657ca1c83496a0 WatchSource:0}: Error finding container 5ada441e933e05d7acb7b6fe53cf1632c6a20890698b0c7b36657ca1c83496a0: Status 404 returned error can't find the container with id 5ada441e933e05d7acb7b6fe53cf1632c6a20890698b0c7b36657ca1c83496a0 Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 
10:57:12.589858 4695 ???:1] "http: TLS handshake error from 192.168.126.11:37944: no serving certificate available for the kubelet" Mar 20 10:57:12 crc kubenswrapper[4695]: W0320 10:57:12.590570 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59e89e9c_6d73_42f9_98e0_674beb4ffab6.slice/crio-139d352090c3cad3e02f22ac6571e6a5974ff4dae17eb3fdc4939f4a8838f424 WatchSource:0}: Error finding container 139d352090c3cad3e02f22ac6571e6a5974ff4dae17eb3fdc4939f4a8838f424: Status 404 returned error can't find the container with id 139d352090c3cad3e02f22ac6571e6a5974ff4dae17eb3fdc4939f4a8838f424 Mar 20 10:57:12 crc kubenswrapper[4695]: W0320 10:57:12.598035 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45e62be2_918a_437b_9028_771a6ad9957e.slice/crio-c112866a53fd4dd5ea54106acc5809b7614ec8ff43d6aa39de4821f56a815ad9 WatchSource:0}: Error finding container c112866a53fd4dd5ea54106acc5809b7614ec8ff43d6aa39de4821f56a815ad9: Status 404 returned error can't find the container with id c112866a53fd4dd5ea54106acc5809b7614ec8ff43d6aa39de4821f56a815ad9 Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.600317 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:12 crc kubenswrapper[4695]: E0320 10:57:12.600439 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 10:57:13.100412206 +0000 UTC m=+210.881017779 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.602322 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:12 crc kubenswrapper[4695]: E0320 10:57:12.604180 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:13.104159603 +0000 UTC m=+210.884765166 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.700638 4695 ???:1] "http: TLS handshake error from 192.168.126.11:37948: no serving certificate available for the kubelet"
Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.709197 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:57:12 crc kubenswrapper[4695]: E0320 10:57:12.709964 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:13.209900607 +0000 UTC m=+210.990506180 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.812090 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk"
Mar 20 10:57:12 crc kubenswrapper[4695]: E0320 10:57:12.812493 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:13.31248163 +0000 UTC m=+211.093087193 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.888949 4695 ???:1] "http: TLS handshake error from 192.168.126.11:37964: no serving certificate available for the kubelet"
Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.913588 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:57:12 crc kubenswrapper[4695]: E0320 10:57:12.913898 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:13.413871812 +0000 UTC m=+211.194477375 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:57:12 crc kubenswrapper[4695]: I0320 10:57:12.914287 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk"
Mar 20 10:57:12 crc kubenswrapper[4695]: E0320 10:57:12.914755 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:13.414739734 +0000 UTC m=+211.195345297 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.015756 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:57:13 crc kubenswrapper[4695]: E0320 10:57:13.017194 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:13.517171033 +0000 UTC m=+211.297776596 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.118471 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk"
Mar 20 10:57:13 crc kubenswrapper[4695]: E0320 10:57:13.118958 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:13.618936365 +0000 UTC m=+211.399541928 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.219311 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:57:13 crc kubenswrapper[4695]: E0320 10:57:13.219755 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:13.719738342 +0000 UTC m=+211.500343905 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.319645 4695 ???:1] "http: TLS handshake error from 192.168.126.11:37978: no serving certificate available for the kubelet"
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.321156 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk"
Mar 20 10:57:13 crc kubenswrapper[4695]: E0320 10:57:13.321586 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:13.821570385 +0000 UTC m=+211.602175948 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.371607 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-gnknn"
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.379594 4695 patch_prober.go:28] interesting pod/router-default-5444994796-gnknn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 10:57:13 crc kubenswrapper[4695]: [-]has-synced failed: reason withheld
Mar 20 10:57:13 crc kubenswrapper[4695]: [+]process-running ok
Mar 20 10:57:13 crc kubenswrapper[4695]: healthz check failed
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.379703 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnknn" podUID="9e14b497-1345-4352-9b10-10deaeb1f6ef" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.427617 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:57:13 crc kubenswrapper[4695]: E0320 10:57:13.428010 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:13.927992216 +0000 UTC m=+211.708597779 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.508339 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-s2xcj" event={"ID":"874d0ff7-4923-4423-920a-59e6a632507a","Type":"ContainerStarted","Data":"936f9e8ad7d7aeee8691eef9c50d42dce666dac679cf33d71435f3c3a7ccdccb"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.508398 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-s2xcj" event={"ID":"874d0ff7-4923-4423-920a-59e6a632507a","Type":"ContainerStarted","Data":"67cf2527879fe5af1d3fe7701553016fb56a0e7ca4aef8c5b97142a746107ce9"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.525793 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-dqr46" event={"ID":"214a7b7b-3483-4084-bf89-fffbe6e5d591","Type":"ContainerStarted","Data":"2e9a5e2447e7f93ae3ff34cd4acdd860f9ccda62bacbfae7fa7985b9dc205074"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.529819 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk"
Mar 20 10:57:13 crc kubenswrapper[4695]: E0320 10:57:13.530254 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:14.03023882 +0000 UTC m=+211.810844383 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.557731 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qjmq4" event={"ID":"1fa5c872-2ea1-43b9-b09d-d40de9e50bca","Type":"ContainerStarted","Data":"6817bc8af15623de8e1680d7c9c3f698defaefe04833f5577831e8ae1d52247c"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.566597 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-96sfc" event={"ID":"ff9715a6-753c-4341-899a-e769836ad4e1","Type":"ContainerStarted","Data":"afce612cf0a5013eeb134f70593addccb0c83918e91ca18a8f8f0431963d249c"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.566667 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-96sfc" event={"ID":"ff9715a6-753c-4341-899a-e769836ad4e1","Type":"ContainerStarted","Data":"959e8562391b74f89fa35355fe3460e7358ba033ddf12845308657bc08e11131"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.570133 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xc2wd" event={"ID":"8936aa6b-6612-48bc-b512-d2c0d72b730a","Type":"ContainerStarted","Data":"cdd866023fc32e2d38dce3d901d3c87b0bec06d19a27b2f9a6a1898c4bcc2861"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.595499 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566736-t45k7" event={"ID":"b1ec0c6c-21bf-4b5f-b973-fd68c0d1c0f5","Type":"ContainerStarted","Data":"935a52373d357ebf12ed350f2e06ac50e4696b9ad52c70285c0f9c18ce60a225"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.606580 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ns9rz" event={"ID":"649c40d8-a63f-4361-89b5-15e4ccc2e4dc","Type":"ContainerStarted","Data":"d26e5571e0901292b031aadf817ffb86ff286ff4a3d6989da44c04a1e5774a94"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.623345 4695 generic.go:334] "Generic (PLEG): container finished" podID="42a13b86-8970-4571-9a14-9dea1c55558f" containerID="b54bc46aad422ba17a4c5927b8c29cee2b6892db3eb43f37fd780bfcb0593f26" exitCode=0
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.623476 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" event={"ID":"42a13b86-8970-4571-9a14-9dea1c55558f","Type":"ContainerDied","Data":"b54bc46aad422ba17a4c5927b8c29cee2b6892db3eb43f37fd780bfcb0593f26"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.647438 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:57:13 crc kubenswrapper[4695]: E0320 10:57:13.649017 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:14.1489951 +0000 UTC m=+211.929600663 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.654338 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-w7b9p" event={"ID":"36b62628-63ee-4520-8787-ce943f478c0b","Type":"ContainerStarted","Data":"ac2d388531bb8b769c88a3fa2ab2e475686eee8f5cd065da34078af139610467"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.654404 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-w7b9p" event={"ID":"36b62628-63ee-4520-8787-ce943f478c0b","Type":"ContainerStarted","Data":"2c113c144aaad3064b2842c18ca024dfec724b41312f98dbddf071aaf5136edf"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.655085 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-w7b9p"
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.656450 4695 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-w7b9p container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body=
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.656491 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-w7b9p" podUID="36b62628-63ee-4520-8787-ce943f478c0b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused"
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.664782 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xz9q6" event={"ID":"cf246004-3476-4ae3-8a8b-25f2e6d44ec9","Type":"ContainerStarted","Data":"b46dd0a1a7a5969bb91465c2e81fbfc09d85f08b86564da53f3f34bedd01d579"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.664855 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xz9q6" event={"ID":"cf246004-3476-4ae3-8a8b-25f2e6d44ec9","Type":"ContainerStarted","Data":"8bbda1832f71f5f5e64519a4e7e44ed1bf992e981cccdafe1d2e88b5a06b3b08"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.671547 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zwjht" event={"ID":"b8cd777d-2168-411d-8232-1162ad5a99d2","Type":"ContainerStarted","Data":"704d843e71dfe521e650bc3096baad4a1a3aeee3ffcb4720bb2d3e837d9d35e1"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.672102 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zwjht" event={"ID":"b8cd777d-2168-411d-8232-1162ad5a99d2","Type":"ContainerStarted","Data":"ff88867870539e790a058151d86aeee5cdfcab5cdaa0f61a1617137e279f0802"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.672124 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zwjht"
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.680791 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-djlcz" event={"ID":"288d9110-491b-424c-b001-47835e23220f","Type":"ContainerStarted","Data":"ee856fd7306a0bd6c186449c037549ec1bbab5187d81bcc30b39b6542e43bbba"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.683245 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hdzsw" event={"ID":"59e89e9c-6d73-42f9-98e0-674beb4ffab6","Type":"ContainerStarted","Data":"8ffac1439d5242e09cc779e4fcb31a2d4b0ea3d6971bac1eed2793c050804ac9"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.683292 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hdzsw" event={"ID":"59e89e9c-6d73-42f9-98e0-674beb4ffab6","Type":"ContainerStarted","Data":"139d352090c3cad3e02f22ac6571e6a5974ff4dae17eb3fdc4939f4a8838f424"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.690174 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqhhq" event={"ID":"379303a6-d86c-4861-b7d5-b46f2a336fb9","Type":"ContainerStarted","Data":"6d3694ef9fd0e07efff6c9e0e24f74d29a52503724c3cea95fac4d1330cdd3c1"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.692669 4695 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-zwjht container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" start-of-body=
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.692756 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zwjht" podUID="b8cd777d-2168-411d-8232-1162ad5a99d2" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused"
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.724749 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lkn7q" event={"ID":"95e09f08-a8b1-4873-80a7-cfeab3b3f3b6","Type":"ContainerStarted","Data":"468932d17e754f4b493b12ad894873deccd4cb20614bab30f37a32149c0b4504"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.724819 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lkn7q" event={"ID":"95e09f08-a8b1-4873-80a7-cfeab3b3f3b6","Type":"ContainerStarted","Data":"88027a6868f7c8877a5bf57861437080060770aeb6a6f8bf76adc819fd6b6048"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.739287 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dsv9m" event={"ID":"ef890f11-8ab4-4d75-b3a3-14d3957fe746","Type":"ContainerStarted","Data":"ded66afe99664b00040c64d2dd5d41dd45b063d54f1610e091c1e0bbf81af6f0"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.749153 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk"
Mar 20 10:57:13 crc kubenswrapper[4695]: E0320 10:57:13.761731 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:14.261711344 +0000 UTC m=+212.042316907 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.766884 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9z5kl" event={"ID":"4f2b2bcc-8fa7-4d23-8577-4e09d5485874","Type":"ContainerStarted","Data":"8d1ca4d0f067e1c884251c42f4a3ffac20d9a4beee36c9c1983332c4a5a127a7"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.766964 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9z5kl" event={"ID":"4f2b2bcc-8fa7-4d23-8577-4e09d5485874","Type":"ContainerStarted","Data":"5ada441e933e05d7acb7b6fe53cf1632c6a20890698b0c7b36657ca1c83496a0"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.771130 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9z5kl"
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.774205 4695 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-9z5kl container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body=
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.774270 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9z5kl" podUID="4f2b2bcc-8fa7-4d23-8577-4e09d5485874" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused"
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.797835 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-szd4f" event={"ID":"d7e53738-a7cd-451c-983e-6bc96fabaa27","Type":"ContainerStarted","Data":"17d570212fc5679be4e3951e9924ac98dbd883ed2cdb9e50236dc87987a9d6ae"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.797898 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-szd4f" event={"ID":"d7e53738-a7cd-451c-983e-6bc96fabaa27","Type":"ContainerStarted","Data":"d6a8e960d8f3532b7e6e854d0397f8f78bb0656e2a63eeedd54efa5c024bf0be"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.857016 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 10:57:13 crc kubenswrapper[4695]: E0320 10:57:13.857997 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:14.357970724 +0000 UTC m=+212.138576287 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.872593 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7f96s" event={"ID":"8d6c9aa1-aeff-45f6-be3a-beabcdf7faf0","Type":"ContainerStarted","Data":"e3b826308c0cb983c9de13f2181c9559da9c66d62e84976777528a4218f249ce"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.884403 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mmw7q" event={"ID":"6c351d34-16d2-4ca1-a9d3-e2be750e636c","Type":"ContainerStarted","Data":"a6bb2bb566bb1a9c999f8c2afeabd8a5f576c6014b0604d276799f481d75a3a2"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.901770 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9mvwv" event={"ID":"761d3c58-fc9e-4428-b633-89b435413025","Type":"ContainerStarted","Data":"5a643751ff4284dcb5776e924baca86fcd9d99f226e54e36d91a53afcaea51a0"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.912492 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9z5kl" podStartSLOduration=167.912452157 podStartE2EDuration="2m47.912452157s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:13.863084615 +0000 UTC m=+211.643690188" watchObservedRunningTime="2026-03-20 10:57:13.912452157 +0000 UTC m=+211.693057720"
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.933163 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kz6kx" event={"ID":"1caa47f1-6d17-46d8-9e6d-7c7501f3621a","Type":"ContainerStarted","Data":"79c598ce4f28d24368cead297d9a5b59894a5fdd521a057b6e1450e1de3151a7"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.938348 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-djlcz" podStartSLOduration=167.938318594 podStartE2EDuration="2m47.938318594s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:13.89469076 +0000 UTC m=+211.675296343" watchObservedRunningTime="2026-03-20 10:57:13.938318594 +0000 UTC m=+211.718924157"
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.939216 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-s2xcj" podStartSLOduration=167.939207817 podStartE2EDuration="2m47.939207817s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:13.934425013 +0000 UTC m=+211.715030576" watchObservedRunningTime="2026-03-20 10:57:13.939207817 +0000 UTC m=+211.719813390"
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.967606 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk"
Mar 20 10:57:13 crc kubenswrapper[4695]: E0320 10:57:13.968089 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:14.46807051 +0000 UTC m=+212.248676073 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.970509 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-w7b9p" podStartSLOduration=167.970474892 podStartE2EDuration="2m47.970474892s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:13.968422909 +0000 UTC m=+211.749028492" watchObservedRunningTime="2026-03-20 10:57:13.970474892 +0000 UTC m=+211.751080465"
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.972705 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qd69b" event={"ID":"287784c8-36de-41b8-8203-1923a9e1624b","Type":"ContainerStarted","Data":"0c5ab7998372a1bf25e6f17fab7bbb20b8adc66dcbafeb6e9b9e56f8f8a3536f"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.988510 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2dmjw" event={"ID":"2a2b9241-d006-442c-9133-74ac83793483","Type":"ContainerStarted","Data":"a8139d638735edce4968be8be566d575af2226d72bb73288f122ccde9452ba6e"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.989973 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m9rp" event={"ID":"45e62be2-918a-437b-9028-771a6ad9957e","Type":"ContainerStarted","Data":"c112866a53fd4dd5ea54106acc5809b7614ec8ff43d6aa39de4821f56a815ad9"}
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.995443 4695 generic.go:334] "Generic (PLEG): container finished" podID="f1b00244-2d56-4cfc-a852-0ef9c33214e1" containerID="02c43db5407700f6b164c869217b896dc9caa1b9d85fb79daf70716e7e9f0442" exitCode=0
Mar 20 10:57:13 crc kubenswrapper[4695]: I0320 10:57:13.995514 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb" event={"ID":"f1b00244-2d56-4cfc-a852-0ef9c33214e1","Type":"ContainerDied","Data":"02c43db5407700f6b164c869217b896dc9caa1b9d85fb79daf70716e7e9f0442"}
Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.000167 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" event={"ID":"c5f41a94-dc5d-4026-983e-52e817217252","Type":"ContainerStarted","Data":"537c70a1f9d4c6e8600ac402a58bc5f2eab15e5bd2588dba31a8908ee46319f1"}
Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.000896 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-r82b4"
Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.002592 4695 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-r82b4 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" start-of-body=
Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.002648 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" podUID="c5f41a94-dc5d-4026-983e-52e817217252" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused"
Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.005935 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-qt6v2" event={"ID":"2eafc357-18f4-49a8-88be-d7e67ed800a0","Type":"ContainerStarted","Data":"91585b1f21b2154949883fb06e2da716982e430c256e2e5f55cf591cbbf69ba1"}
Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.006004 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-qt6v2" event={"ID":"2eafc357-18f4-49a8-88be-d7e67ed800a0","Type":"ContainerStarted","Data":"694569f74274f77741198de3b03d3a83a158867349b7b72b7de7fb478ca7e4e4"}
Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.008475 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vnnmf" event={"ID":"178400b6-5c7d-4e98-9884-ac349ecc48e8","Type":"ContainerStarted","Data":"86c98cfd76578fc02380e2c15087ea6c7335e499f453a4ffec421c0828fa44ef"}
Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.015750 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fxxgv" event={"ID":"c8136dfe-4e0c-4b37-a1bd-54162cf7c83a","Type":"ContainerStarted","Data":"4148c999cba9d5dbc4c1b7c3530d7459269f2bdbb0a7330fbfae88e648bb6e26"}
Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.016209 4695 patch_prober.go:28] interesting pod/downloads-7954f5f757-dzwsk container/download-server namespace/openshift-console:
Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.016305 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dzwsk" podUID="acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.028244 4695 ???:1] "http: TLS handshake error from 192.168.126.11:37982: no serving certificate available for the kubelet" Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.033512 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zwjht" podStartSLOduration=168.033484735 podStartE2EDuration="2m48.033484735s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:13.99715524 +0000 UTC m=+211.777760813" watchObservedRunningTime="2026-03-20 10:57:14.033484735 +0000 UTC m=+211.814090298" Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.063671 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-wqhhq" podStartSLOduration=168.063641452 podStartE2EDuration="2m48.063641452s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:14.063361595 +0000 UTC m=+211.843967158" watchObservedRunningTime="2026-03-20 10:57:14.063641452 +0000 UTC m=+211.844247015" Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.068875 4695 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:14 crc kubenswrapper[4695]: E0320 10:57:14.070264 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:14.570242912 +0000 UTC m=+212.350848475 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.111218 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-hdzsw" podStartSLOduration=168.111196598 podStartE2EDuration="2m48.111196598s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:14.108654382 +0000 UTC m=+211.889259945" watchObservedRunningTime="2026-03-20 10:57:14.111196598 +0000 UTC m=+211.891802161" Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.154505 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-etcd-operator/etcd-operator-b45778765-dqr46" podStartSLOduration=168.154471913 podStartE2EDuration="2m48.154471913s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:14.138898561 +0000 UTC m=+211.919504134" watchObservedRunningTime="2026-03-20 10:57:14.154471913 +0000 UTC m=+211.935077476" Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.170469 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:14 crc kubenswrapper[4695]: E0320 10:57:14.174658 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:14.674642782 +0000 UTC m=+212.455248345 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.183811 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-lkn7q" podStartSLOduration=168.183784758 podStartE2EDuration="2m48.183784758s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:14.177345392 +0000 UTC m=+211.957950955" watchObservedRunningTime="2026-03-20 10:57:14.183784758 +0000 UTC m=+211.964390321" Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.220484 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xz9q6" podStartSLOduration=168.220466823 podStartE2EDuration="2m48.220466823s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:14.218972725 +0000 UTC m=+211.999578298" watchObservedRunningTime="2026-03-20 10:57:14.220466823 +0000 UTC m=+212.001072406" Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.275541 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:14 crc kubenswrapper[4695]: E0320 10:57:14.276147 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:14.776127087 +0000 UTC m=+212.556732650 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.305226 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vnnmf" podStartSLOduration=168.305192436 podStartE2EDuration="2m48.305192436s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:14.291240746 +0000 UTC m=+212.071846319" watchObservedRunningTime="2026-03-20 10:57:14.305192436 +0000 UTC m=+212.085797999" Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.355436 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" podStartSLOduration=168.355405959 podStartE2EDuration="2m48.355405959s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 10:57:14.353471139 +0000 UTC m=+212.134076722" watchObservedRunningTime="2026-03-20 10:57:14.355405959 +0000 UTC m=+212.136011532" Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.382706 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:14 crc kubenswrapper[4695]: E0320 10:57:14.383303 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:14.883285577 +0000 UTC m=+212.663891140 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.388205 4695 patch_prober.go:28] interesting pod/router-default-5444994796-gnknn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:57:14 crc kubenswrapper[4695]: [-]has-synced failed: reason withheld Mar 20 10:57:14 crc kubenswrapper[4695]: [+]process-running ok Mar 20 10:57:14 crc kubenswrapper[4695]: healthz check failed Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 
10:57:14.388272 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnknn" podUID="9e14b497-1345-4352-9b10-10deaeb1f6ef" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.435287 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-qt6v2" podStartSLOduration=168.435259316 podStartE2EDuration="2m48.435259316s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:14.40319168 +0000 UTC m=+212.183797273" watchObservedRunningTime="2026-03-20 10:57:14.435259316 +0000 UTC m=+212.215864879" Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.490003 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:14 crc kubenswrapper[4695]: E0320 10:57:14.490235 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:14.990181811 +0000 UTC m=+212.770787374 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.490655 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:14 crc kubenswrapper[4695]: E0320 10:57:14.491117 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:14.991103065 +0000 UTC m=+212.771708628 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.592209 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:14 crc kubenswrapper[4695]: E0320 10:57:14.592689 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:15.092646561 +0000 UTC m=+212.873252124 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.593081 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:14 crc kubenswrapper[4695]: E0320 10:57:14.593611 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:15.093603876 +0000 UTC m=+212.874209429 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.694832 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:14 crc kubenswrapper[4695]: E0320 10:57:14.695457 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:15.195428149 +0000 UTC m=+212.976033712 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.804148 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:14 crc kubenswrapper[4695]: E0320 10:57:14.804663 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:15.304644473 +0000 UTC m=+213.085250036 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.905816 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:14 crc kubenswrapper[4695]: E0320 10:57:14.906029 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:15.405980714 +0000 UTC m=+213.186586287 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:14 crc kubenswrapper[4695]: I0320 10:57:14.906931 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:14 crc kubenswrapper[4695]: E0320 10:57:14.907377 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:15.407365289 +0000 UTC m=+213.187970852 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.008124 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:15 crc kubenswrapper[4695]: E0320 10:57:15.008162 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:15.508137676 +0000 UTC m=+213.288743239 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.013203 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:15 crc kubenswrapper[4695]: E0320 10:57:15.013771 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:15.51375688 +0000 UTC m=+213.294362443 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.075251 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fxxgv" event={"ID":"c8136dfe-4e0c-4b37-a1bd-54162cf7c83a","Type":"ContainerStarted","Data":"0be002585a59b22221c74e3d4a0ee57b72d0c3bdd03f4e5be9095e06b77026d6"} Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.087961 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dsv9m" event={"ID":"ef890f11-8ab4-4d75-b3a3-14d3957fe746","Type":"ContainerStarted","Data":"3b16eba3cbd4c3f4807bf35c5ff802484ffa37edce574df35633f3a5c6252d3a"} Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.114460 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:15 crc kubenswrapper[4695]: E0320 10:57:15.115384 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:15.615342098 +0000 UTC m=+213.395947661 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.126308 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m9rp" event={"ID":"45e62be2-918a-437b-9028-771a6ad9957e","Type":"ContainerStarted","Data":"7002aa1adcd4f9c19a252bb8bc581faa6df9d98b5696088e5357b4086405ae20"} Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.161538 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mmw7q" event={"ID":"6c351d34-16d2-4ca1-a9d3-e2be750e636c","Type":"ContainerStarted","Data":"1d241c236b2d12ea9af6fa249c18aa65d30d308e3ad5a4dc8d2cecce2c724cac"} Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.161590 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-mmw7q" event={"ID":"6c351d34-16d2-4ca1-a9d3-e2be750e636c","Type":"ContainerStarted","Data":"26a799f7f6209fe8cd7a04a28ace6585a13ef34c56225101cc75b5cb1ff08ab0"} Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.163700 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dsv9m" podStartSLOduration=8.163684663 podStartE2EDuration="8.163684663s" podCreationTimestamp="2026-03-20 10:57:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:15.159618418 +0000 UTC m=+212.940223981" 
watchObservedRunningTime="2026-03-20 10:57:15.163684663 +0000 UTC m=+212.944290226" Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.169037 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qjmq4" event={"ID":"1fa5c872-2ea1-43b9-b09d-d40de9e50bca","Type":"ContainerStarted","Data":"01d448581f41a5283ad3371a8a39b49dbb53dfc528c3eee725d02b2afc72e64d"} Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.169662 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qjmq4" Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.191128 4695 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-qjmq4 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.191207 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qjmq4" podUID="1fa5c872-2ea1-43b9-b09d-d40de9e50bca" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.208887 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb" event={"ID":"f1b00244-2d56-4cfc-a852-0ef9c33214e1","Type":"ContainerStarted","Data":"2a7c98ee8b3d03eb68b13faef6881b8fb02215277381749f90269e784c68bf0c"} Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.217246 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:15 crc kubenswrapper[4695]: E0320 10:57:15.218266 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:15.718238998 +0000 UTC m=+213.498844561 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.229600 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qd69b" event={"ID":"287784c8-36de-41b8-8203-1923a9e1624b","Type":"ContainerStarted","Data":"6146523d9a259d8b0c4e231ea0531b10508f5ca1489c63ce7f23fc16def5377a"} Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.229677 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qd69b" event={"ID":"287784c8-36de-41b8-8203-1923a9e1624b","Type":"ContainerStarted","Data":"c11601d446307f9ed107ed451ccf53cdf07540e42f73f4d47e47a40bfbe9aeb7"} Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.255564 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xc2wd" 
event={"ID":"8936aa6b-6612-48bc-b512-d2c0d72b730a","Type":"ContainerStarted","Data":"a2e14c786b3502a6080d23d5c93a8cfd63b38719346d6f08ddcd62fa2f3d4451"} Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.255620 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xc2wd" event={"ID":"8936aa6b-6612-48bc-b512-d2c0d72b730a","Type":"ContainerStarted","Data":"cb054915e0e63d6fa09345090d9bbc5acecdd075f7f1c500baae9d625d20b7a7"} Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.256272 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xc2wd" Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.276869 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5m9rp" podStartSLOduration=169.276845288 podStartE2EDuration="2m49.276845288s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:15.273444781 +0000 UTC m=+213.054050344" watchObservedRunningTime="2026-03-20 10:57:15.276845288 +0000 UTC m=+213.057450851" Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.301254 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kz6kx" event={"ID":"1caa47f1-6d17-46d8-9e6d-7c7501f3621a","Type":"ContainerStarted","Data":"719f734c74873eacc9587ec168a231734574050d0ca87ba34fc7f4e995a6f090"} Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.301339 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kz6kx" 
event={"ID":"1caa47f1-6d17-46d8-9e6d-7c7501f3621a","Type":"ContainerStarted","Data":"3efc8f47ea57d127c61370a86f0aa28a4c1d769bf1f59602603ea22c05717ccc"} Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.319818 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:15 crc kubenswrapper[4695]: E0320 10:57:15.322656 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:15.822626808 +0000 UTC m=+213.603232371 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.336384 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2dmjw" event={"ID":"2a2b9241-d006-442c-9133-74ac83793483","Type":"ContainerStarted","Data":"39f319ca057401ed6dc87d6d537a1977cb2184973f89a652b29e652dad9a42e8"} Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.336464 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2dmjw" 
event={"ID":"2a2b9241-d006-442c-9133-74ac83793483","Type":"ContainerStarted","Data":"bbccb2963690d190bae93833f4795fa0ff175e9b8eccc98e4a3d89d715705c2d"} Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.387819 4695 patch_prober.go:28] interesting pod/router-default-5444994796-gnknn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:57:15 crc kubenswrapper[4695]: [-]has-synced failed: reason withheld Mar 20 10:57:15 crc kubenswrapper[4695]: [+]process-running ok Mar 20 10:57:15 crc kubenswrapper[4695]: healthz check failed Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.387900 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnknn" podUID="9e14b497-1345-4352-9b10-10deaeb1f6ef" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.397763 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7f96s" event={"ID":"8d6c9aa1-aeff-45f6-be3a-beabcdf7faf0","Type":"ContainerStarted","Data":"c56a014175f099a4bf66142b8585cedbdb821f8fd1949f083f5ce30d936e9a7e"} Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.397839 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7f96s" event={"ID":"8d6c9aa1-aeff-45f6-be3a-beabcdf7faf0","Type":"ContainerStarted","Data":"01a63864bb4089347a73972bc4b26ffeaacdbacdda4a374334f4092a325c704f"} Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.422950 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" 
(UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:15 crc kubenswrapper[4695]: E0320 10:57:15.424109 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:15.924083052 +0000 UTC m=+213.704688615 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.427690 4695 ???:1] "http: TLS handshake error from 192.168.126.11:37992: no serving certificate available for the kubelet" Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.434652 4695 generic.go:334] "Generic (PLEG): container finished" podID="649c40d8-a63f-4361-89b5-15e4ccc2e4dc" containerID="d26e5571e0901292b031aadf817ffb86ff286ff4a3d6989da44c04a1e5774a94" exitCode=0 Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.434839 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ns9rz" event={"ID":"649c40d8-a63f-4361-89b5-15e4ccc2e4dc","Type":"ContainerDied","Data":"d26e5571e0901292b031aadf817ffb86ff286ff4a3d6989da44c04a1e5774a94"} Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.434890 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ns9rz" 
event={"ID":"649c40d8-a63f-4361-89b5-15e4ccc2e4dc","Type":"ContainerStarted","Data":"2d65f25d698c31aee4120ae5228706a68fc33edc52092e00226a5331125ff0a6"} Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.435296 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ns9rz" Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.479059 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-szd4f" event={"ID":"d7e53738-a7cd-451c-983e-6bc96fabaa27","Type":"ContainerStarted","Data":"34cd94009ff9bcb09d8c0d4742aabea40a037f6b089c8f3fc59d127a3d3a23d9"} Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.489284 4695 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-r82b4 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" start-of-body= Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.489371 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" podUID="c5f41a94-dc5d-4026-983e-52e817217252" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.491109 4695 patch_prober.go:28] interesting pod/downloads-7954f5f757-dzwsk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.491141 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dzwsk" podUID="acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.491648 4695 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-w7b9p container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" start-of-body= Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.491731 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-w7b9p" podUID="36b62628-63ee-4520-8787-ce943f478c0b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.39:8080/healthz\": dial tcp 10.217.0.39:8080: connect: connection refused" Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.492209 4695 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-9z5kl container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.492243 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9z5kl" podUID="4f2b2bcc-8fa7-4d23-8577-4e09d5485874" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.492793 4695 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-zwjht container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: 
connection refused" start-of-body= Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.492847 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zwjht" podUID="b8cd777d-2168-411d-8232-1162ad5a99d2" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.530714 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:15 crc kubenswrapper[4695]: E0320 10:57:15.532797 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:16.032764262 +0000 UTC m=+213.813369835 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.565451 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-qd69b" podStartSLOduration=169.565421023 podStartE2EDuration="2m49.565421023s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:15.548004454 +0000 UTC m=+213.328610017" watchObservedRunningTime="2026-03-20 10:57:15.565421023 +0000 UTC m=+213.346026586" Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.638107 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:15 crc kubenswrapper[4695]: E0320 10:57:15.643875 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:16.143850494 +0000 UTC m=+213.924456197 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.689040 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xc2wd" podStartSLOduration=169.689003807 podStartE2EDuration="2m49.689003807s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:15.671808814 +0000 UTC m=+213.452414377" watchObservedRunningTime="2026-03-20 10:57:15.689003807 +0000 UTC m=+213.469609380" Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.741200 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:15 crc kubenswrapper[4695]: E0320 10:57:15.742391 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:16.242361221 +0000 UTC m=+214.022966784 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.760132 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-mmw7q" podStartSLOduration=169.760105649 podStartE2EDuration="2m49.760105649s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:15.717175072 +0000 UTC m=+213.497780655" watchObservedRunningTime="2026-03-20 10:57:15.760105649 +0000 UTC m=+213.540711212" Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.839519 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qjmq4" podStartSLOduration=169.839495134 podStartE2EDuration="2m49.839495134s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:15.763130017 +0000 UTC m=+213.543735570" watchObservedRunningTime="2026-03-20 10:57:15.839495134 +0000 UTC m=+213.620100697" Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.841410 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb" podStartSLOduration=169.841402283 podStartE2EDuration="2m49.841402283s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:15.838042396 +0000 UTC m=+213.618647959" watchObservedRunningTime="2026-03-20 10:57:15.841402283 +0000 UTC m=+213.622007846" Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.843601 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:15 crc kubenswrapper[4695]: E0320 10:57:15.844129 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:16.344114123 +0000 UTC m=+214.124719676 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.942446 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-96sfc" podStartSLOduration=169.942418706 podStartE2EDuration="2m49.942418706s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:15.877858032 +0000 UTC m=+213.658463595" watchObservedRunningTime="2026-03-20 10:57:15.942418706 +0000 UTC m=+213.723024279" Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.944225 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2dmjw" podStartSLOduration=169.944211762 podStartE2EDuration="2m49.944211762s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:15.940702671 +0000 UTC m=+213.721308234" watchObservedRunningTime="2026-03-20 10:57:15.944211762 +0000 UTC m=+213.724817325" Mar 20 10:57:15 crc kubenswrapper[4695]: I0320 10:57:15.946348 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:15 crc kubenswrapper[4695]: E0320 10:57:15.946988 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:16.446940072 +0000 UTC m=+214.227545645 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:16 crc kubenswrapper[4695]: I0320 10:57:16.048333 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:16 crc kubenswrapper[4695]: E0320 10:57:16.048804 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:16.548788646 +0000 UTC m=+214.329394209 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:16 crc kubenswrapper[4695]: I0320 10:57:16.078629 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7f96s" podStartSLOduration=170.078603814 podStartE2EDuration="2m50.078603814s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:15.998920471 +0000 UTC m=+213.779526034" watchObservedRunningTime="2026-03-20 10:57:16.078603814 +0000 UTC m=+213.859209377" Mar 20 10:57:16 crc kubenswrapper[4695]: I0320 10:57:16.126804 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-szd4f" podStartSLOduration=170.126774415 podStartE2EDuration="2m50.126774415s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:16.120647617 +0000 UTC m=+213.901253180" watchObservedRunningTime="2026-03-20 10:57:16.126774415 +0000 UTC m=+213.907379978" Mar 20 10:57:16 crc kubenswrapper[4695]: I0320 10:57:16.150057 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:16 crc kubenswrapper[4695]: E0320 10:57:16.150685 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:16.65064895 +0000 UTC m=+214.431254583 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:16 crc kubenswrapper[4695]: I0320 10:57:16.161008 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kz6kx" podStartSLOduration=170.160974446 podStartE2EDuration="2m50.160974446s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:16.158182794 +0000 UTC m=+213.938788357" watchObservedRunningTime="2026-03-20 10:57:16.160974446 +0000 UTC m=+213.941580039" Mar 20 10:57:16 crc kubenswrapper[4695]: I0320 10:57:16.216158 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ns9rz" podStartSLOduration=170.216129057 podStartE2EDuration="2m50.216129057s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 10:57:16.214182157 +0000 UTC m=+213.994787730" watchObservedRunningTime="2026-03-20 10:57:16.216129057 +0000 UTC m=+213.996734640" Mar 20 10:57:16 crc kubenswrapper[4695]: I0320 10:57:16.261411 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:16 crc kubenswrapper[4695]: E0320 10:57:16.261993 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:16.761965098 +0000 UTC m=+214.542570661 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:16 crc kubenswrapper[4695]: I0320 10:57:16.363734 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:16 crc kubenswrapper[4695]: E0320 10:57:16.363860 4695 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:16.863839343 +0000 UTC m=+214.644444906 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:16 crc kubenswrapper[4695]: I0320 10:57:16.364174 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:16 crc kubenswrapper[4695]: E0320 10:57:16.364565 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:16.864555511 +0000 UTC m=+214.645161064 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:16 crc kubenswrapper[4695]: I0320 10:57:16.385159 4695 patch_prober.go:28] interesting pod/router-default-5444994796-gnknn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:57:16 crc kubenswrapper[4695]: [-]has-synced failed: reason withheld Mar 20 10:57:16 crc kubenswrapper[4695]: [+]process-running ok Mar 20 10:57:16 crc kubenswrapper[4695]: healthz check failed Mar 20 10:57:16 crc kubenswrapper[4695]: I0320 10:57:16.385228 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnknn" podUID="9e14b497-1345-4352-9b10-10deaeb1f6ef" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:57:16 crc kubenswrapper[4695]: I0320 10:57:16.466007 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:16 crc kubenswrapper[4695]: E0320 10:57:16.466135 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 10:57:16.966114487 +0000 UTC m=+214.746720050 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:16 crc kubenswrapper[4695]: I0320 10:57:16.466478 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:16 crc kubenswrapper[4695]: E0320 10:57:16.466791 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:16.966782535 +0000 UTC m=+214.747388098 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:16 crc kubenswrapper[4695]: I0320 10:57:16.527145 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" event={"ID":"42a13b86-8970-4571-9a14-9dea1c55558f","Type":"ContainerStarted","Data":"eec5a17c34e0dc186d015ac24e3c61a2d72d1532420df5ef9f51d8dcf0519dff"} Mar 20 10:57:16 crc kubenswrapper[4695]: I0320 10:57:16.533266 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9mvwv" event={"ID":"761d3c58-fc9e-4428-b633-89b435413025","Type":"ContainerStarted","Data":"f519efb48893324a04bc196029b1576af6dc4410daded2eed7bed88376e3c0c7"} Mar 20 10:57:16 crc kubenswrapper[4695]: I0320 10:57:16.558484 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fxxgv" event={"ID":"c8136dfe-4e0c-4b37-a1bd-54162cf7c83a","Type":"ContainerStarted","Data":"1108869effc94547391f2626cb8eb869e576975cbc40ad7c412fb398ef16b303"} Mar 20 10:57:16 crc kubenswrapper[4695]: I0320 10:57:16.561569 4695 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-qjmq4 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Mar 20 10:57:16 crc kubenswrapper[4695]: I0320 10:57:16.561637 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qjmq4" 
podUID="1fa5c872-2ea1-43b9-b09d-d40de9e50bca" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" Mar 20 10:57:16 crc kubenswrapper[4695]: I0320 10:57:16.567513 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:16 crc kubenswrapper[4695]: E0320 10:57:16.568109 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:17.068085195 +0000 UTC m=+214.848690758 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:16 crc kubenswrapper[4695]: I0320 10:57:16.572827 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-9z5kl" Mar 20 10:57:16 crc kubenswrapper[4695]: I0320 10:57:16.637210 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-fxxgv" podStartSLOduration=9.637182155 podStartE2EDuration="9.637182155s" podCreationTimestamp="2026-03-20 10:57:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:16.592892394 +0000 UTC m=+214.373497957" watchObservedRunningTime="2026-03-20 10:57:16.637182155 +0000 UTC m=+214.417787718" Mar 20 10:57:16 crc kubenswrapper[4695]: I0320 10:57:16.671759 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:16 crc kubenswrapper[4695]: E0320 10:57:16.675237 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:17.175212395 +0000 UTC m=+214.955818148 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:16 crc kubenswrapper[4695]: I0320 10:57:16.775099 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:16 crc kubenswrapper[4695]: E0320 10:57:16.775527 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:17.275503788 +0000 UTC m=+215.056109351 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:16 crc kubenswrapper[4695]: I0320 10:57:16.877163 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:16 crc kubenswrapper[4695]: E0320 10:57:16.877728 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:17.377704651 +0000 UTC m=+215.158310214 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:16 crc kubenswrapper[4695]: I0320 10:57:16.978578 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:16 crc kubenswrapper[4695]: E0320 10:57:16.978800 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:17.478762644 +0000 UTC m=+215.259368217 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:16 crc kubenswrapper[4695]: I0320 10:57:16.979002 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:16 crc kubenswrapper[4695]: E0320 10:57:16.979504 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:17.479488613 +0000 UTC m=+215.260094176 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:17 crc kubenswrapper[4695]: I0320 10:57:17.090513 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:17 crc kubenswrapper[4695]: E0320 10:57:17.092371 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:17.592306169 +0000 UTC m=+215.372911742 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:17 crc kubenswrapper[4695]: I0320 10:57:17.196939 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:17 crc kubenswrapper[4695]: E0320 10:57:17.197494 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:17.697425437 +0000 UTC m=+215.478031000 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:17 crc kubenswrapper[4695]: I0320 10:57:17.298376 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:17 crc kubenswrapper[4695]: E0320 10:57:17.298680 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:17.798663816 +0000 UTC m=+215.579269379 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:17 crc kubenswrapper[4695]: I0320 10:57:17.326860 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:57:17 crc kubenswrapper[4695]: I0320 10:57:17.376950 4695 patch_prober.go:28] interesting pod/router-default-5444994796-gnknn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:57:17 crc kubenswrapper[4695]: [-]has-synced failed: reason withheld Mar 20 10:57:17 crc kubenswrapper[4695]: [+]process-running ok Mar 20 10:57:17 crc kubenswrapper[4695]: healthz check failed Mar 20 10:57:17 crc kubenswrapper[4695]: I0320 10:57:17.377026 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnknn" podUID="9e14b497-1345-4352-9b10-10deaeb1f6ef" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:57:17 crc kubenswrapper[4695]: I0320 10:57:17.400047 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:17 crc kubenswrapper[4695]: E0320 10:57:17.400622 4695 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:17.900599582 +0000 UTC m=+215.681205145 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:17 crc kubenswrapper[4695]: I0320 10:57:17.501491 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:17 crc kubenswrapper[4695]: E0320 10:57:17.501817 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:18.001801689 +0000 UTC m=+215.782407252 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:17 crc kubenswrapper[4695]: I0320 10:57:17.572927 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" event={"ID":"42a13b86-8970-4571-9a14-9dea1c55558f","Type":"ContainerStarted","Data":"f57e16677db553ecb1b28ac10c184e678410e3ec7b085094ff0befac6356dfe9"} Mar 20 10:57:17 crc kubenswrapper[4695]: I0320 10:57:17.573780 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-fxxgv" Mar 20 10:57:17 crc kubenswrapper[4695]: I0320 10:57:17.581745 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-qjmq4" Mar 20 10:57:17 crc kubenswrapper[4695]: I0320 10:57:17.605093 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:17 crc kubenswrapper[4695]: E0320 10:57:17.605635 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:18.105617914 +0000 UTC m=+215.886223477 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:17 crc kubenswrapper[4695]: I0320 10:57:17.685999 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" podStartSLOduration=171.685970494 podStartE2EDuration="2m51.685970494s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:17.630809053 +0000 UTC m=+215.411414616" watchObservedRunningTime="2026-03-20 10:57:17.685970494 +0000 UTC m=+215.466576067" Mar 20 10:57:17 crc kubenswrapper[4695]: I0320 10:57:17.706561 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:17 crc kubenswrapper[4695]: E0320 10:57:17.708814 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:18.208794552 +0000 UTC m=+215.989400115 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:17 crc kubenswrapper[4695]: I0320 10:57:17.809319 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:17 crc kubenswrapper[4695]: E0320 10:57:17.810282 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:18.310259696 +0000 UTC m=+216.090865259 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:17 crc kubenswrapper[4695]: I0320 10:57:17.911373 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:17 crc kubenswrapper[4695]: E0320 10:57:17.911811 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:18.411787992 +0000 UTC m=+216.192393555 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.013878 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:18 crc kubenswrapper[4695]: E0320 10:57:18.014440 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:18.514406595 +0000 UTC m=+216.295012348 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.115706 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:18 crc kubenswrapper[4695]: E0320 10:57:18.115973 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:18.615945501 +0000 UTC m=+216.396551064 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.116191 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:18 crc kubenswrapper[4695]: E0320 10:57:18.116894 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:18.616875845 +0000 UTC m=+216.397481408 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.137180 4695 ???:1] "http: TLS handshake error from 192.168.126.11:56486: no serving certificate available for the kubelet" Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.217654 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:18 crc kubenswrapper[4695]: E0320 10:57:18.218212 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:18.718184525 +0000 UTC m=+216.498790098 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.218457 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:18 crc kubenswrapper[4695]: E0320 10:57:18.219204 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:18.719192381 +0000 UTC m=+216.499797944 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.320084 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:18 crc kubenswrapper[4695]: E0320 10:57:18.320252 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:18.820225254 +0000 UTC m=+216.600830817 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.320522 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:18 crc kubenswrapper[4695]: E0320 10:57:18.320930 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:18.820918712 +0000 UTC m=+216.601524275 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.369865 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mzrcb"] Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.372582 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mzrcb" Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.378149 4695 patch_prober.go:28] interesting pod/router-default-5444994796-gnknn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:57:18 crc kubenswrapper[4695]: [-]has-synced failed: reason withheld Mar 20 10:57:18 crc kubenswrapper[4695]: [+]process-running ok Mar 20 10:57:18 crc kubenswrapper[4695]: healthz check failed Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.378268 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnknn" podUID="9e14b497-1345-4352-9b10-10deaeb1f6ef" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.385950 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.392431 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/community-operators-mzrcb"] Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.421528 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:18 crc kubenswrapper[4695]: E0320 10:57:18.422231 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:18.922201561 +0000 UTC m=+216.702807114 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.526225 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.526319 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e96190cb-8d03-4cb7-b3f6-6b46a141f969-utilities\") pod \"community-operators-mzrcb\" (UID: \"e96190cb-8d03-4cb7-b3f6-6b46a141f969\") " pod="openshift-marketplace/community-operators-mzrcb" Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.526589 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e96190cb-8d03-4cb7-b3f6-6b46a141f969-catalog-content\") pod \"community-operators-mzrcb\" (UID: \"e96190cb-8d03-4cb7-b3f6-6b46a141f969\") " pod="openshift-marketplace/community-operators-mzrcb" Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.526678 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6786\" (UniqueName: \"kubernetes.io/projected/e96190cb-8d03-4cb7-b3f6-6b46a141f969-kube-api-access-w6786\") pod \"community-operators-mzrcb\" (UID: \"e96190cb-8d03-4cb7-b3f6-6b46a141f969\") " pod="openshift-marketplace/community-operators-mzrcb" Mar 20 10:57:18 crc kubenswrapper[4695]: E0320 10:57:18.527365 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:19.02734132 +0000 UTC m=+216.807947083 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:18 crc kubenswrapper[4695]: E0320 10:57:18.628322 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:19.128290531 +0000 UTC m=+216.908896094 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.628183 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.628880 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e96190cb-8d03-4cb7-b3f6-6b46a141f969-catalog-content\") pod \"community-operators-mzrcb\" (UID: 
\"e96190cb-8d03-4cb7-b3f6-6b46a141f969\") " pod="openshift-marketplace/community-operators-mzrcb" Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.629479 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6786\" (UniqueName: \"kubernetes.io/projected/e96190cb-8d03-4cb7-b3f6-6b46a141f969-kube-api-access-w6786\") pod \"community-operators-mzrcb\" (UID: \"e96190cb-8d03-4cb7-b3f6-6b46a141f969\") " pod="openshift-marketplace/community-operators-mzrcb" Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.629414 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e96190cb-8d03-4cb7-b3f6-6b46a141f969-catalog-content\") pod \"community-operators-mzrcb\" (UID: \"e96190cb-8d03-4cb7-b3f6-6b46a141f969\") " pod="openshift-marketplace/community-operators-mzrcb" Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.630160 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:18 crc kubenswrapper[4695]: E0320 10:57:18.630562 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:19.130551459 +0000 UTC m=+216.911157022 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.630804 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e96190cb-8d03-4cb7-b3f6-6b46a141f969-utilities\") pod \"community-operators-mzrcb\" (UID: \"e96190cb-8d03-4cb7-b3f6-6b46a141f969\") " pod="openshift-marketplace/community-operators-mzrcb" Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.631174 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e96190cb-8d03-4cb7-b3f6-6b46a141f969-utilities\") pod \"community-operators-mzrcb\" (UID: \"e96190cb-8d03-4cb7-b3f6-6b46a141f969\") " pod="openshift-marketplace/community-operators-mzrcb" Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.713279 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6786\" (UniqueName: \"kubernetes.io/projected/e96190cb-8d03-4cb7-b3f6-6b46a141f969-kube-api-access-w6786\") pod \"community-operators-mzrcb\" (UID: \"e96190cb-8d03-4cb7-b3f6-6b46a141f969\") " pod="openshift-marketplace/community-operators-mzrcb" Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.736165 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:18 crc kubenswrapper[4695]: E0320 10:57:18.736336 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:19.236294364 +0000 UTC m=+217.016899927 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.736503 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:18 crc kubenswrapper[4695]: E0320 10:57:18.739232 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:19.239200308 +0000 UTC m=+217.019805871 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.785401 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6fqn2"] Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.787407 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6fqn2" Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.811857 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6fqn2"] Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.840849 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:18 crc kubenswrapper[4695]: E0320 10:57:18.841067 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:19.341040742 +0000 UTC m=+217.121646305 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.841382 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:18 crc kubenswrapper[4695]: E0320 10:57:18.841761 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:19.34175363 +0000 UTC m=+217.122359193 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.943332 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.943616 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33f393cc-11cf-4c7a-aeac-8423998e5dc6-catalog-content\") pod \"community-operators-6fqn2\" (UID: \"33f393cc-11cf-4c7a-aeac-8423998e5dc6\") " pod="openshift-marketplace/community-operators-6fqn2" Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.943714 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lppf4\" (UniqueName: \"kubernetes.io/projected/33f393cc-11cf-4c7a-aeac-8423998e5dc6-kube-api-access-lppf4\") pod \"community-operators-6fqn2\" (UID: \"33f393cc-11cf-4c7a-aeac-8423998e5dc6\") " pod="openshift-marketplace/community-operators-6fqn2" Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.943755 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33f393cc-11cf-4c7a-aeac-8423998e5dc6-utilities\") pod \"community-operators-6fqn2\" 
(UID: \"33f393cc-11cf-4c7a-aeac-8423998e5dc6\") " pod="openshift-marketplace/community-operators-6fqn2" Mar 20 10:57:18 crc kubenswrapper[4695]: E0320 10:57:18.943951 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:19.443896862 +0000 UTC m=+217.224502415 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.951560 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-28xd8"] Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.952879 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-28xd8" Mar 20 10:57:18 crc kubenswrapper[4695]: I0320 10:57:18.962451 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.001461 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mzrcb" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.005242 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-28xd8"] Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.031231 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-x95pq" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.045061 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da83bf65-5995-41cf-8f79-98a77e0ace2e-catalog-content\") pod \"certified-operators-28xd8\" (UID: \"da83bf65-5995-41cf-8f79-98a77e0ace2e\") " pod="openshift-marketplace/certified-operators-28xd8" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.045113 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lppf4\" (UniqueName: \"kubernetes.io/projected/33f393cc-11cf-4c7a-aeac-8423998e5dc6-kube-api-access-lppf4\") pod \"community-operators-6fqn2\" (UID: \"33f393cc-11cf-4c7a-aeac-8423998e5dc6\") " pod="openshift-marketplace/community-operators-6fqn2" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.045160 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33f393cc-11cf-4c7a-aeac-8423998e5dc6-utilities\") pod \"community-operators-6fqn2\" (UID: \"33f393cc-11cf-4c7a-aeac-8423998e5dc6\") " pod="openshift-marketplace/community-operators-6fqn2" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.045189 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da83bf65-5995-41cf-8f79-98a77e0ace2e-utilities\") pod \"certified-operators-28xd8\" (UID: 
\"da83bf65-5995-41cf-8f79-98a77e0ace2e\") " pod="openshift-marketplace/certified-operators-28xd8" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.045267 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.045314 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33f393cc-11cf-4c7a-aeac-8423998e5dc6-catalog-content\") pod \"community-operators-6fqn2\" (UID: \"33f393cc-11cf-4c7a-aeac-8423998e5dc6\") " pod="openshift-marketplace/community-operators-6fqn2" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.045346 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxcph\" (UniqueName: \"kubernetes.io/projected/da83bf65-5995-41cf-8f79-98a77e0ace2e-kube-api-access-rxcph\") pod \"certified-operators-28xd8\" (UID: \"da83bf65-5995-41cf-8f79-98a77e0ace2e\") " pod="openshift-marketplace/certified-operators-28xd8" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.046189 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33f393cc-11cf-4c7a-aeac-8423998e5dc6-utilities\") pod \"community-operators-6fqn2\" (UID: \"33f393cc-11cf-4c7a-aeac-8423998e5dc6\") " pod="openshift-marketplace/community-operators-6fqn2" Mar 20 10:57:19 crc kubenswrapper[4695]: E0320 10:57:19.046258 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 10:57:19.546234579 +0000 UTC m=+217.326840142 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.046495 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33f393cc-11cf-4c7a-aeac-8423998e5dc6-catalog-content\") pod \"community-operators-6fqn2\" (UID: \"33f393cc-11cf-4c7a-aeac-8423998e5dc6\") " pod="openshift-marketplace/community-operators-6fqn2" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.104821 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lppf4\" (UniqueName: \"kubernetes.io/projected/33f393cc-11cf-4c7a-aeac-8423998e5dc6-kube-api-access-lppf4\") pod \"community-operators-6fqn2\" (UID: \"33f393cc-11cf-4c7a-aeac-8423998e5dc6\") " pod="openshift-marketplace/community-operators-6fqn2" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.127946 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6fqn2" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.147716 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.147978 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxcph\" (UniqueName: \"kubernetes.io/projected/da83bf65-5995-41cf-8f79-98a77e0ace2e-kube-api-access-rxcph\") pod \"certified-operators-28xd8\" (UID: \"da83bf65-5995-41cf-8f79-98a77e0ace2e\") " pod="openshift-marketplace/certified-operators-28xd8" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.148030 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da83bf65-5995-41cf-8f79-98a77e0ace2e-catalog-content\") pod \"certified-operators-28xd8\" (UID: \"da83bf65-5995-41cf-8f79-98a77e0ace2e\") " pod="openshift-marketplace/certified-operators-28xd8" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.148079 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da83bf65-5995-41cf-8f79-98a77e0ace2e-utilities\") pod \"certified-operators-28xd8\" (UID: \"da83bf65-5995-41cf-8f79-98a77e0ace2e\") " pod="openshift-marketplace/certified-operators-28xd8" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.148528 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da83bf65-5995-41cf-8f79-98a77e0ace2e-utilities\") pod \"certified-operators-28xd8\" (UID: \"da83bf65-5995-41cf-8f79-98a77e0ace2e\") " 
pod="openshift-marketplace/certified-operators-28xd8" Mar 20 10:57:19 crc kubenswrapper[4695]: E0320 10:57:19.148597 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:19.648582205 +0000 UTC m=+217.429187768 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.161471 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da83bf65-5995-41cf-8f79-98a77e0ace2e-catalog-content\") pod \"certified-operators-28xd8\" (UID: \"da83bf65-5995-41cf-8f79-98a77e0ace2e\") " pod="openshift-marketplace/certified-operators-28xd8" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.181849 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-8f97r" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.204355 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-shq4g"] Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.207727 4695 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-ns9rz container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": context deadline exceeded 
(Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.223127 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ns9rz" podUID="649c40d8-a63f-4361-89b5-15e4ccc2e4dc" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.224043 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-ns9rz" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.225391 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-shq4g" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.230688 4695 patch_prober.go:28] interesting pod/downloads-7954f5f757-dzwsk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.230749 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dzwsk" podUID="acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.230823 4695 patch_prober.go:28] interesting pod/downloads-7954f5f757-dzwsk container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.230839 4695 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dzwsk" podUID="acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.249742 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.249991 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.250053 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb" Mar 20 10:57:19 crc kubenswrapper[4695]: E0320 10:57:19.250215 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:19.750200443 +0000 UTC m=+217.530806006 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.268300 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-shq4g"] Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.274673 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxcph\" (UniqueName: \"kubernetes.io/projected/da83bf65-5995-41cf-8f79-98a77e0ace2e-kube-api-access-rxcph\") pod \"certified-operators-28xd8\" (UID: \"da83bf65-5995-41cf-8f79-98a77e0ace2e\") " pod="openshift-marketplace/certified-operators-28xd8" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.282859 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-28xd8" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.305895 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.306409 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.337287 4695 patch_prober.go:28] interesting pod/apiserver-76f77b778f-z8vn7 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.14:8443/livez\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.337358 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" podUID="42a13b86-8970-4571-9a14-9dea1c55558f" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.14:8443/livez\": dial tcp 10.217.0.14:8443: connect: connection refused" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.350587 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.351117 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c4mc\" (UniqueName: \"kubernetes.io/projected/eb233657-545c-4a0b-93a0-b29148b5cb3f-kube-api-access-5c4mc\") pod \"certified-operators-shq4g\" (UID: \"eb233657-545c-4a0b-93a0-b29148b5cb3f\") " pod="openshift-marketplace/certified-operators-shq4g" 
Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.351265 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb233657-545c-4a0b-93a0-b29148b5cb3f-catalog-content\") pod \"certified-operators-shq4g\" (UID: \"eb233657-545c-4a0b-93a0-b29148b5cb3f\") " pod="openshift-marketplace/certified-operators-shq4g" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.351376 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb233657-545c-4a0b-93a0-b29148b5cb3f-utilities\") pod \"certified-operators-shq4g\" (UID: \"eb233657-545c-4a0b-93a0-b29148b5cb3f\") " pod="openshift-marketplace/certified-operators-shq4g" Mar 20 10:57:19 crc kubenswrapper[4695]: E0320 10:57:19.353019 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:19.852991981 +0000 UTC m=+217.633597544 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.354886 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.380869 4695 patch_prober.go:28] interesting pod/router-default-5444994796-gnknn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:57:19 crc kubenswrapper[4695]: [-]has-synced failed: reason withheld Mar 20 10:57:19 crc kubenswrapper[4695]: [+]process-running ok Mar 20 10:57:19 crc kubenswrapper[4695]: healthz check failed Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.380934 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnknn" podUID="9e14b497-1345-4352-9b10-10deaeb1f6ef" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.410399 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-s2xcj" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.410461 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-s2xcj" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.433250 4695 patch_prober.go:28] interesting pod/console-f9d7485db-s2xcj container/console 
namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.23:8443/health\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.433344 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-s2xcj" podUID="874d0ff7-4923-4423-920a-59e6a632507a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.23:8443/health\": dial tcp 10.217.0.23:8443: connect: connection refused" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.454627 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.455684 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.457038 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb233657-545c-4a0b-93a0-b29148b5cb3f-utilities\") pod \"certified-operators-shq4g\" (UID: \"eb233657-545c-4a0b-93a0-b29148b5cb3f\") " pod="openshift-marketplace/certified-operators-shq4g" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.457112 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.457219 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c4mc\" (UniqueName: 
\"kubernetes.io/projected/eb233657-545c-4a0b-93a0-b29148b5cb3f-kube-api-access-5c4mc\") pod \"certified-operators-shq4g\" (UID: \"eb233657-545c-4a0b-93a0-b29148b5cb3f\") " pod="openshift-marketplace/certified-operators-shq4g" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.457274 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb233657-545c-4a0b-93a0-b29148b5cb3f-catalog-content\") pod \"certified-operators-shq4g\" (UID: \"eb233657-545c-4a0b-93a0-b29148b5cb3f\") " pod="openshift-marketplace/certified-operators-shq4g" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.457809 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb233657-545c-4a0b-93a0-b29148b5cb3f-catalog-content\") pod \"certified-operators-shq4g\" (UID: \"eb233657-545c-4a0b-93a0-b29148b5cb3f\") " pod="openshift-marketplace/certified-operators-shq4g" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.458121 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb233657-545c-4a0b-93a0-b29148b5cb3f-utilities\") pod \"certified-operators-shq4g\" (UID: \"eb233657-545c-4a0b-93a0-b29148b5cb3f\") " pod="openshift-marketplace/certified-operators-shq4g" Mar 20 10:57:19 crc kubenswrapper[4695]: E0320 10:57:19.458523 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:19.95850392 +0000 UTC m=+217.739109483 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.519460 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.520145 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.526455 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.560956 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.561355 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5cee76a1-0ffe-4487-817c-725f160b4406-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5cee76a1-0ffe-4487-817c-725f160b4406\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.561686 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5cee76a1-0ffe-4487-817c-725f160b4406-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5cee76a1-0ffe-4487-817c-725f160b4406\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:57:19 crc kubenswrapper[4695]: E0320 10:57:19.563469 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:20.063440093 +0000 UTC m=+217.844045656 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.601395 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9mvwv" event={"ID":"761d3c58-fc9e-4428-b633-89b435413025","Type":"ContainerStarted","Data":"14d70f21a12b225e903744c79c894db6903c3de66e93db56f7ba3b8f96075414"} Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.618446 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c4mc\" (UniqueName: \"kubernetes.io/projected/eb233657-545c-4a0b-93a0-b29148b5cb3f-kube-api-access-5c4mc\") pod \"certified-operators-shq4g\" (UID: \"eb233657-545c-4a0b-93a0-b29148b5cb3f\") " pod="openshift-marketplace/certified-operators-shq4g" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.655469 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fc6tb" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.703894 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5cee76a1-0ffe-4487-817c-725f160b4406-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5cee76a1-0ffe-4487-817c-725f160b4406\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.704524 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.704828 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5cee76a1-0ffe-4487-817c-725f160b4406-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5cee76a1-0ffe-4487-817c-725f160b4406\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:57:19 crc kubenswrapper[4695]: E0320 10:57:19.708219 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:20.208192243 +0000 UTC m=+217.988797966 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.708304 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5cee76a1-0ffe-4487-817c-725f160b4406-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5cee76a1-0ffe-4487-817c-725f160b4406\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.803728 4695 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.807434 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:19 crc kubenswrapper[4695]: E0320 10:57:19.808493 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:20.308443865 +0000 UTC m=+218.089049568 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.819221 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5cee76a1-0ffe-4487-817c-725f160b4406-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5cee76a1-0ffe-4487-817c-725f160b4406\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.890269 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-shq4g" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.900351 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-w7b9p" Mar 20 10:57:19 crc kubenswrapper[4695]: I0320 10:57:19.914169 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:19 crc kubenswrapper[4695]: E0320 10:57:19.914613 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 10:57:20.41459682 +0000 UTC m=+218.195202383 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.020805 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:20 crc kubenswrapper[4695]: E0320 10:57:20.022408 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:20.522380737 +0000 UTC m=+218.302986300 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.108464 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.122110 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:20 crc kubenswrapper[4695]: E0320 10:57:20.122508 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:57:20.622494806 +0000 UTC m=+218.403100369 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-h8rbk" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.129263 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.130291 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.143819 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.147250 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.152406 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.162496 4695 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-20T10:57:19.804165585Z","Handler":null,"Name":""} Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.217413 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zwjht" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.223592 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:20 crc kubenswrapper[4695]: E0320 10:57:20.224070 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:20.724052123 +0000 UTC m=+218.504657686 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.233207 4695 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.233262 4695 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.331848 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/27a7790b-af8c-474c-8486-60178b563a04-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"27a7790b-af8c-474c-8486-60178b563a04\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.331899 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.331978 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/27a7790b-af8c-474c-8486-60178b563a04-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"27a7790b-af8c-474c-8486-60178b563a04\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.359080 4695 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.359131 4695 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.372571 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-gnknn" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.404031 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8f97r"] Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.410900 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-8f97r" podUID="0cb76749-e1ee-42c1-bcef-cb2ca1d793d2" containerName="controller-manager" containerID="cri-o://f3d6e2ece161fe685ec30dca77bc47a0c75a671f8d315d13c544b45bd3f27b11" gracePeriod=30 Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.415771 4695 patch_prober.go:28] interesting pod/router-default-5444994796-gnknn container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:57:20 crc kubenswrapper[4695]: [-]has-synced failed: reason withheld Mar 20 10:57:20 crc kubenswrapper[4695]: [+]process-running ok Mar 20 10:57:20 crc kubenswrapper[4695]: healthz check failed Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.415890 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnknn" podUID="9e14b497-1345-4352-9b10-10deaeb1f6ef" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.435275 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/27a7790b-af8c-474c-8486-60178b563a04-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"27a7790b-af8c-474c-8486-60178b563a04\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.436299 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27a7790b-af8c-474c-8486-60178b563a04-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"27a7790b-af8c-474c-8486-60178b563a04\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.438235 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27a7790b-af8c-474c-8486-60178b563a04-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"27a7790b-af8c-474c-8486-60178b563a04\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.459618 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs"] Mar 20 
10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.465983 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs" podUID="c569b375-f808-4f3d-8f3d-a162677356ff" containerName="route-controller-manager" containerID="cri-o://a247a8936badba2d41e89bf70a0fb81e27b8913627d323fbad080ef85e800894" gracePeriod=30 Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.492641 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/27a7790b-af8c-474c-8486-60178b563a04-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"27a7790b-af8c-474c-8486-60178b563a04\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.569640 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-p78dm"] Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.571043 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p78dm" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.592506 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.609673 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p78dm"] Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.632505 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-h8rbk\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") " pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.651358 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b5f000a-cdbc-486a-9e77-d3bf68046cb7-utilities\") pod \"redhat-marketplace-p78dm\" (UID: \"0b5f000a-cdbc-486a-9e77-d3bf68046cb7\") " pod="openshift-marketplace/redhat-marketplace-p78dm" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.651471 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stks6\" (UniqueName: \"kubernetes.io/projected/0b5f000a-cdbc-486a-9e77-d3bf68046cb7-kube-api-access-stks6\") pod \"redhat-marketplace-p78dm\" (UID: \"0b5f000a-cdbc-486a-9e77-d3bf68046cb7\") " pod="openshift-marketplace/redhat-marketplace-p78dm" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.651548 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b5f000a-cdbc-486a-9e77-d3bf68046cb7-catalog-content\") pod 
\"redhat-marketplace-p78dm\" (UID: \"0b5f000a-cdbc-486a-9e77-d3bf68046cb7\") " pod="openshift-marketplace/redhat-marketplace-p78dm" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.686809 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9mvwv" event={"ID":"761d3c58-fc9e-4428-b633-89b435413025","Type":"ContainerStarted","Data":"c9f9df7ee132305075c4c9613a356ee48983ebe8febe14aa8d8473bc2ab0c5bd"} Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.757859 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.758748 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stks6\" (UniqueName: \"kubernetes.io/projected/0b5f000a-cdbc-486a-9e77-d3bf68046cb7-kube-api-access-stks6\") pod \"redhat-marketplace-p78dm\" (UID: \"0b5f000a-cdbc-486a-9e77-d3bf68046cb7\") " pod="openshift-marketplace/redhat-marketplace-p78dm" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.758885 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b5f000a-cdbc-486a-9e77-d3bf68046cb7-catalog-content\") pod \"redhat-marketplace-p78dm\" (UID: \"0b5f000a-cdbc-486a-9e77-d3bf68046cb7\") " pod="openshift-marketplace/redhat-marketplace-p78dm" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.759144 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b5f000a-cdbc-486a-9e77-d3bf68046cb7-utilities\") pod \"redhat-marketplace-p78dm\" (UID: \"0b5f000a-cdbc-486a-9e77-d3bf68046cb7\") " 
pod="openshift-marketplace/redhat-marketplace-p78dm" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.761849 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b5f000a-cdbc-486a-9e77-d3bf68046cb7-catalog-content\") pod \"redhat-marketplace-p78dm\" (UID: \"0b5f000a-cdbc-486a-9e77-d3bf68046cb7\") " pod="openshift-marketplace/redhat-marketplace-p78dm" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.762457 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b5f000a-cdbc-486a-9e77-d3bf68046cb7-utilities\") pod \"redhat-marketplace-p78dm\" (UID: \"0b5f000a-cdbc-486a-9e77-d3bf68046cb7\") " pod="openshift-marketplace/redhat-marketplace-p78dm" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.789436 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6fqn2"] Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.790469 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.790800 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-28xd8"] Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.837059 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mzrcb"] Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.860473 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stks6\" (UniqueName: \"kubernetes.io/projected/0b5f000a-cdbc-486a-9e77-d3bf68046cb7-kube-api-access-stks6\") pod \"redhat-marketplace-p78dm\" (UID: \"0b5f000a-cdbc-486a-9e77-d3bf68046cb7\") " pod="openshift-marketplace/redhat-marketplace-p78dm" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.902743 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.918752 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 10:57:20 crc kubenswrapper[4695]: W0320 10:57:20.956721 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode96190cb_8d03_4cb7_b3f6_6b46a141f969.slice/crio-33c1853559c370a3b02b5a1671bc00d072e82919a538828219fd73f2b0b08b6a WatchSource:0}: Error finding container 33c1853559c370a3b02b5a1671bc00d072e82919a538828219fd73f2b0b08b6a: Status 404 returned error can't find the container with id 33c1853559c370a3b02b5a1671bc00d072e82919a538828219fd73f2b0b08b6a Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.963615 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.964558 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6mw7h"] Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.967786 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6mw7h" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.983665 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p78dm" Mar 20 10:57:20 crc kubenswrapper[4695]: I0320 10:57:20.998736 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6mw7h"] Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.047928 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-shq4g"] Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.076675 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2ea4f1f-16e3-4ac7-ac16-f782b94669ff-catalog-content\") pod \"redhat-marketplace-6mw7h\" (UID: \"d2ea4f1f-16e3-4ac7-ac16-f782b94669ff\") " pod="openshift-marketplace/redhat-marketplace-6mw7h" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.076946 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2ea4f1f-16e3-4ac7-ac16-f782b94669ff-utilities\") pod \"redhat-marketplace-6mw7h\" (UID: \"d2ea4f1f-16e3-4ac7-ac16-f782b94669ff\") " pod="openshift-marketplace/redhat-marketplace-6mw7h" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.077069 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5h4f\" (UniqueName: \"kubernetes.io/projected/d2ea4f1f-16e3-4ac7-ac16-f782b94669ff-kube-api-access-w5h4f\") pod \"redhat-marketplace-6mw7h\" (UID: \"d2ea4f1f-16e3-4ac7-ac16-f782b94669ff\") " pod="openshift-marketplace/redhat-marketplace-6mw7h" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.181582 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2ea4f1f-16e3-4ac7-ac16-f782b94669ff-utilities\") pod \"redhat-marketplace-6mw7h\" (UID: \"d2ea4f1f-16e3-4ac7-ac16-f782b94669ff\") " 
pod="openshift-marketplace/redhat-marketplace-6mw7h" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.181661 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5h4f\" (UniqueName: \"kubernetes.io/projected/d2ea4f1f-16e3-4ac7-ac16-f782b94669ff-kube-api-access-w5h4f\") pod \"redhat-marketplace-6mw7h\" (UID: \"d2ea4f1f-16e3-4ac7-ac16-f782b94669ff\") " pod="openshift-marketplace/redhat-marketplace-6mw7h" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.181744 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2ea4f1f-16e3-4ac7-ac16-f782b94669ff-catalog-content\") pod \"redhat-marketplace-6mw7h\" (UID: \"d2ea4f1f-16e3-4ac7-ac16-f782b94669ff\") " pod="openshift-marketplace/redhat-marketplace-6mw7h" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.182315 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2ea4f1f-16e3-4ac7-ac16-f782b94669ff-catalog-content\") pod \"redhat-marketplace-6mw7h\" (UID: \"d2ea4f1f-16e3-4ac7-ac16-f782b94669ff\") " pod="openshift-marketplace/redhat-marketplace-6mw7h" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.183752 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2ea4f1f-16e3-4ac7-ac16-f782b94669ff-utilities\") pod \"redhat-marketplace-6mw7h\" (UID: \"d2ea4f1f-16e3-4ac7-ac16-f782b94669ff\") " pod="openshift-marketplace/redhat-marketplace-6mw7h" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.242413 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5h4f\" (UniqueName: \"kubernetes.io/projected/d2ea4f1f-16e3-4ac7-ac16-f782b94669ff-kube-api-access-w5h4f\") pod \"redhat-marketplace-6mw7h\" (UID: \"d2ea4f1f-16e3-4ac7-ac16-f782b94669ff\") " 
pod="openshift-marketplace/redhat-marketplace-6mw7h" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.319112 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6mw7h" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.444073 4695 patch_prober.go:28] interesting pod/router-default-5444994796-gnknn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:57:21 crc kubenswrapper[4695]: [-]has-synced failed: reason withheld Mar 20 10:57:21 crc kubenswrapper[4695]: [+]process-running ok Mar 20 10:57:21 crc kubenswrapper[4695]: healthz check failed Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.445508 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnknn" podUID="9e14b497-1345-4352-9b10-10deaeb1f6ef" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.483212 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8f97r" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.597672 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2-client-ca\") pod \"0cb76749-e1ee-42c1-bcef-cb2ca1d793d2\" (UID: \"0cb76749-e1ee-42c1-bcef-cb2ca1d793d2\") " Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.597793 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2-proxy-ca-bundles\") pod \"0cb76749-e1ee-42c1-bcef-cb2ca1d793d2\" (UID: \"0cb76749-e1ee-42c1-bcef-cb2ca1d793d2\") " Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.597862 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c96kl\" (UniqueName: \"kubernetes.io/projected/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2-kube-api-access-c96kl\") pod \"0cb76749-e1ee-42c1-bcef-cb2ca1d793d2\" (UID: \"0cb76749-e1ee-42c1-bcef-cb2ca1d793d2\") " Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.598059 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2-config\") pod \"0cb76749-e1ee-42c1-bcef-cb2ca1d793d2\" (UID: \"0cb76749-e1ee-42c1-bcef-cb2ca1d793d2\") " Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.598084 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2-serving-cert\") pod \"0cb76749-e1ee-42c1-bcef-cb2ca1d793d2\" (UID: \"0cb76749-e1ee-42c1-bcef-cb2ca1d793d2\") " Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.601398 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "0cb76749-e1ee-42c1-bcef-cb2ca1d793d2" (UID: "0cb76749-e1ee-42c1-bcef-cb2ca1d793d2"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.601807 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2-client-ca" (OuterVolumeSpecName: "client-ca") pod "0cb76749-e1ee-42c1-bcef-cb2ca1d793d2" (UID: "0cb76749-e1ee-42c1-bcef-cb2ca1d793d2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.603162 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2-config" (OuterVolumeSpecName: "config") pod "0cb76749-e1ee-42c1-bcef-cb2ca1d793d2" (UID: "0cb76749-e1ee-42c1-bcef-cb2ca1d793d2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.622754 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2-kube-api-access-c96kl" (OuterVolumeSpecName: "kube-api-access-c96kl") pod "0cb76749-e1ee-42c1-bcef-cb2ca1d793d2" (UID: "0cb76749-e1ee-42c1-bcef-cb2ca1d793d2"). InnerVolumeSpecName "kube-api-access-c96kl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.623428 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0cb76749-e1ee-42c1-bcef-cb2ca1d793d2" (UID: "0cb76749-e1ee-42c1-bcef-cb2ca1d793d2"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.669104 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.707248 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c96kl\" (UniqueName: \"kubernetes.io/projected/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2-kube-api-access-c96kl\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.707287 4695 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.707300 4695 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.707311 4695 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.707320 4695 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.712766 4695 generic.go:334] "Generic (PLEG): container finished" podID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" containerID="8488c576337e96c6eb84aa8b7ad6744195af2acdba5c5fc36f1f8ec1a83c658b" exitCode=0 Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.712971 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6fqn2" 
event={"ID":"33f393cc-11cf-4c7a-aeac-8423998e5dc6","Type":"ContainerDied","Data":"8488c576337e96c6eb84aa8b7ad6744195af2acdba5c5fc36f1f8ec1a83c658b"} Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.713012 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6fqn2" event={"ID":"33f393cc-11cf-4c7a-aeac-8423998e5dc6","Type":"ContainerStarted","Data":"10111797dfa1f248c9c68d890d951f4866214f29b6bce291b2c0deff39fbaba8"} Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.724393 4695 generic.go:334] "Generic (PLEG): container finished" podID="0cb76749-e1ee-42c1-bcef-cb2ca1d793d2" containerID="f3d6e2ece161fe685ec30dca77bc47a0c75a671f8d315d13c544b45bd3f27b11" exitCode=0 Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.724506 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8f97r" event={"ID":"0cb76749-e1ee-42c1-bcef-cb2ca1d793d2","Type":"ContainerDied","Data":"f3d6e2ece161fe685ec30dca77bc47a0c75a671f8d315d13c544b45bd3f27b11"} Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.724557 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-8f97r" event={"ID":"0cb76749-e1ee-42c1-bcef-cb2ca1d793d2","Type":"ContainerDied","Data":"fd45267ba657b2b2eb31731b32b8311c757399f57d2bab11928b1ea0a9a03cbb"} Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.724582 4695 scope.go:117] "RemoveContainer" containerID="f3d6e2ece161fe685ec30dca77bc47a0c75a671f8d315d13c544b45bd3f27b11" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.725850 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-8f97r" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.754645 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-9mvwv" event={"ID":"761d3c58-fc9e-4428-b633-89b435413025","Type":"ContainerStarted","Data":"9b5691883b2be0b1092dc8816f016654d2cf6ac975f2408c2c7b27ae3f162ecb"} Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.766435 4695 generic.go:334] "Generic (PLEG): container finished" podID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" containerID="30067ec28f761750c474c359bf8e0ad6d8a2878f533a1534fdf10cb2d51357a0" exitCode=0 Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.766526 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mzrcb" event={"ID":"e96190cb-8d03-4cb7-b3f6-6b46a141f969","Type":"ContainerDied","Data":"30067ec28f761750c474c359bf8e0ad6d8a2878f533a1534fdf10cb2d51357a0"} Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.766570 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mzrcb" event={"ID":"e96190cb-8d03-4cb7-b3f6-6b46a141f969","Type":"ContainerStarted","Data":"33c1853559c370a3b02b5a1671bc00d072e82919a538828219fd73f2b0b08b6a"} Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.790286 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shq4g" event={"ID":"eb233657-545c-4a0b-93a0-b29148b5cb3f","Type":"ContainerStarted","Data":"9c4785c8b59b6752da47b7ed3c9a6793c11f1117df037f729145ea5097a1b990"} Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.790695 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shq4g" event={"ID":"eb233657-545c-4a0b-93a0-b29148b5cb3f","Type":"ContainerStarted","Data":"4083af247e308dd9d23e806a531aafdedae5a587b6446114c4f3b8d0f80a0831"} Mar 20 10:57:21 crc 
kubenswrapper[4695]: I0320 10:57:21.790711 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-9mvwv" podStartSLOduration=15.790680553 podStartE2EDuration="15.790680553s" podCreationTimestamp="2026-03-20 10:57:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:21.788178828 +0000 UTC m=+219.568784391" watchObservedRunningTime="2026-03-20 10:57:21.790680553 +0000 UTC m=+219.571286116" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.804001 4695 generic.go:334] "Generic (PLEG): container finished" podID="c569b375-f808-4f3d-8f3d-a162677356ff" containerID="a247a8936badba2d41e89bf70a0fb81e27b8913627d323fbad080ef85e800894" exitCode=0 Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.804097 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs" event={"ID":"c569b375-f808-4f3d-8f3d-a162677356ff","Type":"ContainerDied","Data":"a247a8936badba2d41e89bf70a0fb81e27b8913627d323fbad080ef85e800894"} Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.804138 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs" event={"ID":"c569b375-f808-4f3d-8f3d-a162677356ff","Type":"ContainerDied","Data":"cb5f6c9db323f50f56be48363d3d90938be7ecfb9332a07929bfbecf8ad92ecd"} Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.804149 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb5f6c9db323f50f56be48363d3d90938be7ecfb9332a07929bfbecf8ad92ecd" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.828398 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28xd8" 
event={"ID":"da83bf65-5995-41cf-8f79-98a77e0ace2e","Type":"ContainerStarted","Data":"b07be4cac708dfa83396768af9a3ab98c1fc9719042388a003f699e72f5f187b"} Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.828475 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28xd8" event={"ID":"da83bf65-5995-41cf-8f79-98a77e0ace2e","Type":"ContainerStarted","Data":"7fb86a4ba9ddf2f8a7f072339b16e593c02e09aa56f102993852eff5d2b93840"} Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.866397 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-h8rbk"] Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.926654 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kt9vc"] Mar 20 10:57:21 crc kubenswrapper[4695]: E0320 10:57:21.928293 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cb76749-e1ee-42c1-bcef-cb2ca1d793d2" containerName="controller-manager" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.928330 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cb76749-e1ee-42c1-bcef-cb2ca1d793d2" containerName="controller-manager" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.928479 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cb76749-e1ee-42c1-bcef-cb2ca1d793d2" containerName="controller-manager" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.929504 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kt9vc" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.936811 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.965248 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kt9vc"] Mar 20 10:57:21 crc kubenswrapper[4695]: I0320 10:57:21.986205 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.035089 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c569b375-f808-4f3d-8f3d-a162677356ff-client-ca\") pod \"c569b375-f808-4f3d-8f3d-a162677356ff\" (UID: \"c569b375-f808-4f3d-8f3d-a162677356ff\") " Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.036520 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c569b375-f808-4f3d-8f3d-a162677356ff-config\") pod \"c569b375-f808-4f3d-8f3d-a162677356ff\" (UID: \"c569b375-f808-4f3d-8f3d-a162677356ff\") " Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.036780 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4krs7\" (UniqueName: \"kubernetes.io/projected/c569b375-f808-4f3d-8f3d-a162677356ff-kube-api-access-4krs7\") pod \"c569b375-f808-4f3d-8f3d-a162677356ff\" (UID: \"c569b375-f808-4f3d-8f3d-a162677356ff\") " Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.039140 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c569b375-f808-4f3d-8f3d-a162677356ff-serving-cert\") pod 
\"c569b375-f808-4f3d-8f3d-a162677356ff\" (UID: \"c569b375-f808-4f3d-8f3d-a162677356ff\") " Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.039664 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4wl4\" (UniqueName: \"kubernetes.io/projected/53830966-0b62-40fe-9f81-c18c95ea50aa-kube-api-access-g4wl4\") pod \"redhat-operators-kt9vc\" (UID: \"53830966-0b62-40fe-9f81-c18c95ea50aa\") " pod="openshift-marketplace/redhat-operators-kt9vc" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.039820 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53830966-0b62-40fe-9f81-c18c95ea50aa-utilities\") pod \"redhat-operators-kt9vc\" (UID: \"53830966-0b62-40fe-9f81-c18c95ea50aa\") " pod="openshift-marketplace/redhat-operators-kt9vc" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.039949 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53830966-0b62-40fe-9f81-c18c95ea50aa-catalog-content\") pod \"redhat-operators-kt9vc\" (UID: \"53830966-0b62-40fe-9f81-c18c95ea50aa\") " pod="openshift-marketplace/redhat-operators-kt9vc" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.044158 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c569b375-f808-4f3d-8f3d-a162677356ff-client-ca" (OuterVolumeSpecName: "client-ca") pod "c569b375-f808-4f3d-8f3d-a162677356ff" (UID: "c569b375-f808-4f3d-8f3d-a162677356ff"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.048801 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c569b375-f808-4f3d-8f3d-a162677356ff-config" (OuterVolumeSpecName: "config") pod "c569b375-f808-4f3d-8f3d-a162677356ff" (UID: "c569b375-f808-4f3d-8f3d-a162677356ff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.049397 4695 scope.go:117] "RemoveContainer" containerID="f3d6e2ece161fe685ec30dca77bc47a0c75a671f8d315d13c544b45bd3f27b11" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.056217 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8f97r"] Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.072256 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c569b375-f808-4f3d-8f3d-a162677356ff-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c569b375-f808-4f3d-8f3d-a162677356ff" (UID: "c569b375-f808-4f3d-8f3d-a162677356ff"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.078181 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-8f97r"] Mar 20 10:57:22 crc kubenswrapper[4695]: E0320 10:57:22.086502 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3d6e2ece161fe685ec30dca77bc47a0c75a671f8d315d13c544b45bd3f27b11\": container with ID starting with f3d6e2ece161fe685ec30dca77bc47a0c75a671f8d315d13c544b45bd3f27b11 not found: ID does not exist" containerID="f3d6e2ece161fe685ec30dca77bc47a0c75a671f8d315d13c544b45bd3f27b11" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.086575 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3d6e2ece161fe685ec30dca77bc47a0c75a671f8d315d13c544b45bd3f27b11"} err="failed to get container status \"f3d6e2ece161fe685ec30dca77bc47a0c75a671f8d315d13c544b45bd3f27b11\": rpc error: code = NotFound desc = could not find container \"f3d6e2ece161fe685ec30dca77bc47a0c75a671f8d315d13c544b45bd3f27b11\": container with ID starting with f3d6e2ece161fe685ec30dca77bc47a0c75a671f8d315d13c544b45bd3f27b11 not found: ID does not exist" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.087022 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-77ffd8674f-7mcv2"] Mar 20 10:57:22 crc kubenswrapper[4695]: E0320 10:57:22.087781 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c569b375-f808-4f3d-8f3d-a162677356ff" containerName="route-controller-manager" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.087802 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="c569b375-f808-4f3d-8f3d-a162677356ff" containerName="route-controller-manager" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.088113 4695 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="c569b375-f808-4f3d-8f3d-a162677356ff" containerName="route-controller-manager" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.088843 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77ffd8674f-7mcv2" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.099287 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c569b375-f808-4f3d-8f3d-a162677356ff-kube-api-access-4krs7" (OuterVolumeSpecName: "kube-api-access-4krs7") pod "c569b375-f808-4f3d-8f3d-a162677356ff" (UID: "c569b375-f808-4f3d-8f3d-a162677356ff"). InnerVolumeSpecName "kube-api-access-4krs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.119688 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.126410 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.126746 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.128331 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.133237 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.133348 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.133508 4695 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.138266 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.142457 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209-proxy-ca-bundles\") pod \"controller-manager-77ffd8674f-7mcv2\" (UID: \"5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209\") " pod="openshift-controller-manager/controller-manager-77ffd8674f-7mcv2" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.142555 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53830966-0b62-40fe-9f81-c18c95ea50aa-utilities\") pod \"redhat-operators-kt9vc\" (UID: \"53830966-0b62-40fe-9f81-c18c95ea50aa\") " pod="openshift-marketplace/redhat-operators-kt9vc" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.142603 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209-config\") pod \"controller-manager-77ffd8674f-7mcv2\" (UID: \"5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209\") " pod="openshift-controller-manager/controller-manager-77ffd8674f-7mcv2" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.142641 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw859\" (UniqueName: \"kubernetes.io/projected/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209-kube-api-access-dw859\") pod \"controller-manager-77ffd8674f-7mcv2\" (UID: \"5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209\") " pod="openshift-controller-manager/controller-manager-77ffd8674f-7mcv2" Mar 20 10:57:22 crc 
kubenswrapper[4695]: I0320 10:57:22.142668 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209-serving-cert\") pod \"controller-manager-77ffd8674f-7mcv2\" (UID: \"5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209\") " pod="openshift-controller-manager/controller-manager-77ffd8674f-7mcv2" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.142695 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53830966-0b62-40fe-9f81-c18c95ea50aa-catalog-content\") pod \"redhat-operators-kt9vc\" (UID: \"53830966-0b62-40fe-9f81-c18c95ea50aa\") " pod="openshift-marketplace/redhat-operators-kt9vc" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.142740 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209-client-ca\") pod \"controller-manager-77ffd8674f-7mcv2\" (UID: \"5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209\") " pod="openshift-controller-manager/controller-manager-77ffd8674f-7mcv2" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.142792 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4wl4\" (UniqueName: \"kubernetes.io/projected/53830966-0b62-40fe-9f81-c18c95ea50aa-kube-api-access-g4wl4\") pod \"redhat-operators-kt9vc\" (UID: \"53830966-0b62-40fe-9f81-c18c95ea50aa\") " pod="openshift-marketplace/redhat-operators-kt9vc" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.142849 4695 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c569b375-f808-4f3d-8f3d-a162677356ff-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.142862 4695 reconciler_common.go:293] "Volume 
detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c569b375-f808-4f3d-8f3d-a162677356ff-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.142874 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4krs7\" (UniqueName: \"kubernetes.io/projected/c569b375-f808-4f3d-8f3d-a162677356ff-kube-api-access-4krs7\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.142885 4695 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c569b375-f808-4f3d-8f3d-a162677356ff-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.146552 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53830966-0b62-40fe-9f81-c18c95ea50aa-catalog-content\") pod \"redhat-operators-kt9vc\" (UID: \"53830966-0b62-40fe-9f81-c18c95ea50aa\") " pod="openshift-marketplace/redhat-operators-kt9vc" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.150023 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53830966-0b62-40fe-9f81-c18c95ea50aa-utilities\") pod \"redhat-operators-kt9vc\" (UID: \"53830966-0b62-40fe-9f81-c18c95ea50aa\") " pod="openshift-marketplace/redhat-operators-kt9vc" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.150147 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77ffd8674f-7mcv2"] Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.185082 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4wl4\" (UniqueName: \"kubernetes.io/projected/53830966-0b62-40fe-9f81-c18c95ea50aa-kube-api-access-g4wl4\") pod \"redhat-operators-kt9vc\" (UID: \"53830966-0b62-40fe-9f81-c18c95ea50aa\") " 
pod="openshift-marketplace/redhat-operators-kt9vc" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.190600 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-p78dm"] Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.255687 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209-proxy-ca-bundles\") pod \"controller-manager-77ffd8674f-7mcv2\" (UID: \"5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209\") " pod="openshift-controller-manager/controller-manager-77ffd8674f-7mcv2" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.255753 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209-config\") pod \"controller-manager-77ffd8674f-7mcv2\" (UID: \"5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209\") " pod="openshift-controller-manager/controller-manager-77ffd8674f-7mcv2" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.255788 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw859\" (UniqueName: \"kubernetes.io/projected/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209-kube-api-access-dw859\") pod \"controller-manager-77ffd8674f-7mcv2\" (UID: \"5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209\") " pod="openshift-controller-manager/controller-manager-77ffd8674f-7mcv2" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.255810 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209-serving-cert\") pod \"controller-manager-77ffd8674f-7mcv2\" (UID: \"5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209\") " pod="openshift-controller-manager/controller-manager-77ffd8674f-7mcv2" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.255841 4695 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209-client-ca\") pod \"controller-manager-77ffd8674f-7mcv2\" (UID: \"5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209\") " pod="openshift-controller-manager/controller-manager-77ffd8674f-7mcv2" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.257066 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209-client-ca\") pod \"controller-manager-77ffd8674f-7mcv2\" (UID: \"5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209\") " pod="openshift-controller-manager/controller-manager-77ffd8674f-7mcv2" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.257419 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209-config\") pod \"controller-manager-77ffd8674f-7mcv2\" (UID: \"5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209\") " pod="openshift-controller-manager/controller-manager-77ffd8674f-7mcv2" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.258757 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209-proxy-ca-bundles\") pod \"controller-manager-77ffd8674f-7mcv2\" (UID: \"5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209\") " pod="openshift-controller-manager/controller-manager-77ffd8674f-7mcv2" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.263180 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209-serving-cert\") pod \"controller-manager-77ffd8674f-7mcv2\" (UID: \"5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209\") " pod="openshift-controller-manager/controller-manager-77ffd8674f-7mcv2" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 
10:57:22.282689 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw859\" (UniqueName: \"kubernetes.io/projected/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209-kube-api-access-dw859\") pod \"controller-manager-77ffd8674f-7mcv2\" (UID: \"5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209\") " pod="openshift-controller-manager/controller-manager-77ffd8674f-7mcv2" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.321703 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jnfk6"] Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.323388 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jnfk6" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.334445 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kt9vc" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.341735 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jnfk6"] Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.357092 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a6824e3-65ec-404c-ac28-59fce8d50d83-catalog-content\") pod \"redhat-operators-jnfk6\" (UID: \"2a6824e3-65ec-404c-ac28-59fce8d50d83\") " pod="openshift-marketplace/redhat-operators-jnfk6" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.357194 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw9nb\" (UniqueName: \"kubernetes.io/projected/2a6824e3-65ec-404c-ac28-59fce8d50d83-kube-api-access-hw9nb\") pod \"redhat-operators-jnfk6\" (UID: \"2a6824e3-65ec-404c-ac28-59fce8d50d83\") " pod="openshift-marketplace/redhat-operators-jnfk6" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.357278 4695 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a6824e3-65ec-404c-ac28-59fce8d50d83-utilities\") pod \"redhat-operators-jnfk6\" (UID: \"2a6824e3-65ec-404c-ac28-59fce8d50d83\") " pod="openshift-marketplace/redhat-operators-jnfk6" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.384335 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6mw7h"] Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.387313 4695 patch_prober.go:28] interesting pod/router-default-5444994796-gnknn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:57:22 crc kubenswrapper[4695]: [-]has-synced failed: reason withheld Mar 20 10:57:22 crc kubenswrapper[4695]: [+]process-running ok Mar 20 10:57:22 crc kubenswrapper[4695]: healthz check failed Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.387391 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnknn" podUID="9e14b497-1345-4352-9b10-10deaeb1f6ef" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:57:22 crc kubenswrapper[4695]: W0320 10:57:22.406437 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2ea4f1f_16e3_4ac7_ac16_f782b94669ff.slice/crio-91e103e42a0e829f6be656cfc924e846a4d7d3968b5da5158b6de0d329fa72ba WatchSource:0}: Error finding container 91e103e42a0e829f6be656cfc924e846a4d7d3968b5da5158b6de0d329fa72ba: Status 404 returned error can't find the container with id 91e103e42a0e829f6be656cfc924e846a4d7d3968b5da5158b6de0d329fa72ba Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.459589 4695 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-hw9nb\" (UniqueName: \"kubernetes.io/projected/2a6824e3-65ec-404c-ac28-59fce8d50d83-kube-api-access-hw9nb\") pod \"redhat-operators-jnfk6\" (UID: \"2a6824e3-65ec-404c-ac28-59fce8d50d83\") " pod="openshift-marketplace/redhat-operators-jnfk6" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.459755 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a6824e3-65ec-404c-ac28-59fce8d50d83-utilities\") pod \"redhat-operators-jnfk6\" (UID: \"2a6824e3-65ec-404c-ac28-59fce8d50d83\") " pod="openshift-marketplace/redhat-operators-jnfk6" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.459831 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a6824e3-65ec-404c-ac28-59fce8d50d83-catalog-content\") pod \"redhat-operators-jnfk6\" (UID: \"2a6824e3-65ec-404c-ac28-59fce8d50d83\") " pod="openshift-marketplace/redhat-operators-jnfk6" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.460553 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a6824e3-65ec-404c-ac28-59fce8d50d83-catalog-content\") pod \"redhat-operators-jnfk6\" (UID: \"2a6824e3-65ec-404c-ac28-59fce8d50d83\") " pod="openshift-marketplace/redhat-operators-jnfk6" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.460645 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a6824e3-65ec-404c-ac28-59fce8d50d83-utilities\") pod \"redhat-operators-jnfk6\" (UID: \"2a6824e3-65ec-404c-ac28-59fce8d50d83\") " pod="openshift-marketplace/redhat-operators-jnfk6" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.478475 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw9nb\" (UniqueName: 
\"kubernetes.io/projected/2a6824e3-65ec-404c-ac28-59fce8d50d83-kube-api-access-hw9nb\") pod \"redhat-operators-jnfk6\" (UID: \"2a6824e3-65ec-404c-ac28-59fce8d50d83\") " pod="openshift-marketplace/redhat-operators-jnfk6" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.509505 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77ffd8674f-7mcv2" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.644618 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jnfk6" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.830133 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kt9vc"] Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.910690 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cb76749-e1ee-42c1-bcef-cb2ca1d793d2" path="/var/lib/kubelet/pods/0cb76749-e1ee-42c1-bcef-cb2ca1d793d2/volumes" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.934196 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" event={"ID":"58b17661-5628-4e68-aa9e-9d6e850b6dbe","Type":"ContainerStarted","Data":"1e918bf12b87f426964b59fc38aacd1dcc97b3e1020893f334bff29cf12fdd38"} Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.934358 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.941366 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5cee76a1-0ffe-4487-817c-725f160b4406","Type":"ContainerStarted","Data":"7edf0ed0747b5a957780d53f9f873cf8df04e93d466ddd1b5ffb5f495d3a98da"} Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.976277 4695 generic.go:334] "Generic 
(PLEG): container finished" podID="eb233657-545c-4a0b-93a0-b29148b5cb3f" containerID="9c4785c8b59b6752da47b7ed3c9a6793c11f1117df037f729145ea5097a1b990" exitCode=0 Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.976382 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shq4g" event={"ID":"eb233657-545c-4a0b-93a0-b29148b5cb3f","Type":"ContainerDied","Data":"9c4785c8b59b6752da47b7ed3c9a6793c11f1117df037f729145ea5097a1b990"} Mar 20 10:57:22 crc kubenswrapper[4695]: I0320 10:57:22.994745 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p78dm" event={"ID":"0b5f000a-cdbc-486a-9e77-d3bf68046cb7","Type":"ContainerStarted","Data":"6460a31bf6b18d6925e842d6f4cbc96a04c81eacb7e0790e588090999bab2134"} Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.007236 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-77ffd8674f-7mcv2"] Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.024487 4695 generic.go:334] "Generic (PLEG): container finished" podID="da83bf65-5995-41cf-8f79-98a77e0ace2e" containerID="b07be4cac708dfa83396768af9a3ab98c1fc9719042388a003f699e72f5f187b" exitCode=0 Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.024875 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28xd8" event={"ID":"da83bf65-5995-41cf-8f79-98a77e0ace2e","Type":"ContainerDied","Data":"b07be4cac708dfa83396768af9a3ab98c1fc9719042388a003f699e72f5f187b"} Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.065570 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"27a7790b-af8c-474c-8486-60178b563a04","Type":"ContainerStarted","Data":"93e9fb68ef63d517f1b7eca99442fe12d3f11910e5db110fbc15da22e5a5db18"} Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.088678 4695 generic.go:334] "Generic (PLEG): 
container finished" podID="2eafc357-18f4-49a8-88be-d7e67ed800a0" containerID="91585b1f21b2154949883fb06e2da716982e430c256e2e5f55cf591cbbf69ba1" exitCode=0 Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.089154 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-qt6v2" event={"ID":"2eafc357-18f4-49a8-88be-d7e67ed800a0","Type":"ContainerDied","Data":"91585b1f21b2154949883fb06e2da716982e430c256e2e5f55cf591cbbf69ba1"} Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.097278 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz"] Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.099316 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz" Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.100650 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6mw7h" event={"ID":"d2ea4f1f-16e3-4ac7-ac16-f782b94669ff","Type":"ContainerStarted","Data":"91e103e42a0e829f6be656cfc924e846a4d7d3968b5da5158b6de0d329fa72ba"} Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.101045 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs" Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.124317 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz"] Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.177822 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9lpk\" (UniqueName: \"kubernetes.io/projected/257cb34c-eee5-496b-a2c6-2138ef026e42-kube-api-access-m9lpk\") pod \"route-controller-manager-fdf66746b-v4bcz\" (UID: \"257cb34c-eee5-496b-a2c6-2138ef026e42\") " pod="openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz" Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.177969 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/257cb34c-eee5-496b-a2c6-2138ef026e42-config\") pod \"route-controller-manager-fdf66746b-v4bcz\" (UID: \"257cb34c-eee5-496b-a2c6-2138ef026e42\") " pod="openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz" Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.178009 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/257cb34c-eee5-496b-a2c6-2138ef026e42-serving-cert\") pod \"route-controller-manager-fdf66746b-v4bcz\" (UID: \"257cb34c-eee5-496b-a2c6-2138ef026e42\") " pod="openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz" Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.178042 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/257cb34c-eee5-496b-a2c6-2138ef026e42-client-ca\") pod \"route-controller-manager-fdf66746b-v4bcz\" (UID: 
\"257cb34c-eee5-496b-a2c6-2138ef026e42\") " pod="openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz" Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.182823 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.182792918 podStartE2EDuration="3.182792918s" podCreationTimestamp="2026-03-20 10:57:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:23.174149595 +0000 UTC m=+220.954755188" watchObservedRunningTime="2026-03-20 10:57:23.182792918 +0000 UTC m=+220.963398481" Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.184085 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jnfk6"] Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.264399 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" podStartSLOduration=177.26438108 podStartE2EDuration="2m57.26438108s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:23.245279538 +0000 UTC m=+221.025885121" watchObservedRunningTime="2026-03-20 10:57:23.26438108 +0000 UTC m=+221.044986643" Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.281659 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9lpk\" (UniqueName: \"kubernetes.io/projected/257cb34c-eee5-496b-a2c6-2138ef026e42-kube-api-access-m9lpk\") pod \"route-controller-manager-fdf66746b-v4bcz\" (UID: \"257cb34c-eee5-496b-a2c6-2138ef026e42\") " pod="openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz" Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.281791 4695 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/257cb34c-eee5-496b-a2c6-2138ef026e42-config\") pod \"route-controller-manager-fdf66746b-v4bcz\" (UID: \"257cb34c-eee5-496b-a2c6-2138ef026e42\") " pod="openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz" Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.281863 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/257cb34c-eee5-496b-a2c6-2138ef026e42-serving-cert\") pod \"route-controller-manager-fdf66746b-v4bcz\" (UID: \"257cb34c-eee5-496b-a2c6-2138ef026e42\") " pod="openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz" Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.281986 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/257cb34c-eee5-496b-a2c6-2138ef026e42-client-ca\") pod \"route-controller-manager-fdf66746b-v4bcz\" (UID: \"257cb34c-eee5-496b-a2c6-2138ef026e42\") " pod="openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz" Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.283589 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs"] Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.284378 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/257cb34c-eee5-496b-a2c6-2138ef026e42-config\") pod \"route-controller-manager-fdf66746b-v4bcz\" (UID: \"257cb34c-eee5-496b-a2c6-2138ef026e42\") " pod="openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz" Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.285123 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/257cb34c-eee5-496b-a2c6-2138ef026e42-client-ca\") pod \"route-controller-manager-fdf66746b-v4bcz\" (UID: \"257cb34c-eee5-496b-a2c6-2138ef026e42\") " pod="openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz" Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.296039 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/257cb34c-eee5-496b-a2c6-2138ef026e42-serving-cert\") pod \"route-controller-manager-fdf66746b-v4bcz\" (UID: \"257cb34c-eee5-496b-a2c6-2138ef026e42\") " pod="openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz" Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.314843 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9lpk\" (UniqueName: \"kubernetes.io/projected/257cb34c-eee5-496b-a2c6-2138ef026e42-kube-api-access-m9lpk\") pod \"route-controller-manager-fdf66746b-v4bcz\" (UID: \"257cb34c-eee5-496b-a2c6-2138ef026e42\") " pod="openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz" Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.319065 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-8ttcs"] Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.322507 4695 ???:1] "http: TLS handshake error from 192.168.126.11:56492: no serving certificate available for the kubelet" Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.382955 4695 patch_prober.go:28] interesting pod/router-default-5444994796-gnknn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:57:23 crc kubenswrapper[4695]: [-]has-synced failed: reason withheld Mar 20 10:57:23 crc kubenswrapper[4695]: [+]process-running ok Mar 20 10:57:23 crc 
kubenswrapper[4695]: healthz check failed Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.383048 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnknn" podUID="9e14b497-1345-4352-9b10-10deaeb1f6ef" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.484530 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz" Mar 20 10:57:23 crc kubenswrapper[4695]: I0320 10:57:23.921159 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz"] Mar 20 10:57:23 crc kubenswrapper[4695]: W0320 10:57:23.983322 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod257cb34c_eee5_496b_a2c6_2138ef026e42.slice/crio-a3f969bbd5c58c29e35dd90e70b51691a443238e49776c7761deb36549778a14 WatchSource:0}: Error finding container a3f969bbd5c58c29e35dd90e70b51691a443238e49776c7761deb36549778a14: Status 404 returned error can't find the container with id a3f969bbd5c58c29e35dd90e70b51691a443238e49776c7761deb36549778a14 Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.119296 4695 generic.go:334] "Generic (PLEG): container finished" podID="27a7790b-af8c-474c-8486-60178b563a04" containerID="2431e46e35070e31f9c6f3498f41b621a923f47b79a905851c0e8ebc5eee47d6" exitCode=0 Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.119410 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"27a7790b-af8c-474c-8486-60178b563a04","Type":"ContainerDied","Data":"2431e46e35070e31f9c6f3498f41b621a923f47b79a905851c0e8ebc5eee47d6"} Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.123486 4695 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz" event={"ID":"257cb34c-eee5-496b-a2c6-2138ef026e42","Type":"ContainerStarted","Data":"a3f969bbd5c58c29e35dd90e70b51691a443238e49776c7761deb36549778a14"} Mar 20 10:57:24 crc kubenswrapper[4695]: E0320 10:57:24.135208 4695 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-pod5cee76a1_0ffe_4487_817c_725f160b4406.slice/crio-conmon-410c38ad7e572fe8d58f31884bd8850dfaac35ea7323dc3057184858245ba8d1.scope\": RecentStats: unable to find data in memory cache]" Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.139698 4695 generic.go:334] "Generic (PLEG): container finished" podID="2a6824e3-65ec-404c-ac28-59fce8d50d83" containerID="773b064b92375274fd0c30d78da235737658fd1744d6b198ed58e3640851551f" exitCode=0 Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.139857 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jnfk6" event={"ID":"2a6824e3-65ec-404c-ac28-59fce8d50d83","Type":"ContainerDied","Data":"773b064b92375274fd0c30d78da235737658fd1744d6b198ed58e3640851551f"} Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.139899 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jnfk6" event={"ID":"2a6824e3-65ec-404c-ac28-59fce8d50d83","Type":"ContainerStarted","Data":"1e9cbb31bbd2e8b0cc51af61268dffd96783521b37323cee50ed83dad62895dd"} Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.142840 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" event={"ID":"58b17661-5628-4e68-aa9e-9d6e850b6dbe","Type":"ContainerStarted","Data":"2209a6f1b0614ea5a119dcb626ef7ee51ef1f1b8dfc4210749065afcfe358e6c"} Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.169586 4695 generic.go:334] "Generic (PLEG): container finished" 
podID="5cee76a1-0ffe-4487-817c-725f160b4406" containerID="410c38ad7e572fe8d58f31884bd8850dfaac35ea7323dc3057184858245ba8d1" exitCode=0 Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.169715 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5cee76a1-0ffe-4487-817c-725f160b4406","Type":"ContainerDied","Data":"410c38ad7e572fe8d58f31884bd8850dfaac35ea7323dc3057184858245ba8d1"} Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.184805 4695 generic.go:334] "Generic (PLEG): container finished" podID="d2ea4f1f-16e3-4ac7-ac16-f782b94669ff" containerID="d87c64b2c8937d29aa43875b73bb1cfe6be6a70f5f761a8291e7f207518a4467" exitCode=0 Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.188808 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6mw7h" event={"ID":"d2ea4f1f-16e3-4ac7-ac16-f782b94669ff","Type":"ContainerDied","Data":"d87c64b2c8937d29aa43875b73bb1cfe6be6a70f5f761a8291e7f207518a4467"} Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.205293 4695 generic.go:334] "Generic (PLEG): container finished" podID="0b5f000a-cdbc-486a-9e77-d3bf68046cb7" containerID="751f477e1fd16b2abcde0fd1298428c0718ce2b773e9cbb6887e8322497094f9" exitCode=0 Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.205460 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p78dm" event={"ID":"0b5f000a-cdbc-486a-9e77-d3bf68046cb7","Type":"ContainerDied","Data":"751f477e1fd16b2abcde0fd1298428c0718ce2b773e9cbb6887e8322497094f9"} Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.283119 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77ffd8674f-7mcv2" event={"ID":"5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209","Type":"ContainerStarted","Data":"c8ba2e02d27a4cbab7995f8c08ba6c5274fec83b92720fa385863b167108d67e"} Mar 20 10:57:24 crc kubenswrapper[4695]: 
I0320 10:57:24.283191 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77ffd8674f-7mcv2" event={"ID":"5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209","Type":"ContainerStarted","Data":"8277ed1ffab2190a499e3962559c1a6a92bbdbbfc8fc390dc89f2575a24ac8c5"} Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.284213 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-77ffd8674f-7mcv2" Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.292885 4695 generic.go:334] "Generic (PLEG): container finished" podID="53830966-0b62-40fe-9f81-c18c95ea50aa" containerID="1520aa3c3e7898f956303e00d1fb82486c0a621e77e60faf3412dff24393e008" exitCode=0 Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.292991 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kt9vc" event={"ID":"53830966-0b62-40fe-9f81-c18c95ea50aa","Type":"ContainerDied","Data":"1520aa3c3e7898f956303e00d1fb82486c0a621e77e60faf3412dff24393e008"} Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.293048 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kt9vc" event={"ID":"53830966-0b62-40fe-9f81-c18c95ea50aa","Type":"ContainerStarted","Data":"58a144ff3006720da3b38d6cc9128c7f419cbf4abcb6e3f04fde0f77df6759ca"} Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.300648 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-77ffd8674f-7mcv2" Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.331591 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-77ffd8674f-7mcv2" podStartSLOduration=4.331557833 podStartE2EDuration="4.331557833s" podCreationTimestamp="2026-03-20 10:57:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:24.312095381 +0000 UTC m=+222.092700954" watchObservedRunningTime="2026-03-20 10:57:24.331557833 +0000 UTC m=+222.112163396" Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.343863 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.354455 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-z8vn7" Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.376181 4695 patch_prober.go:28] interesting pod/router-default-5444994796-gnknn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:57:24 crc kubenswrapper[4695]: [-]has-synced failed: reason withheld Mar 20 10:57:24 crc kubenswrapper[4695]: [+]process-running ok Mar 20 10:57:24 crc kubenswrapper[4695]: healthz check failed Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.376255 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnknn" podUID="9e14b497-1345-4352-9b10-10deaeb1f6ef" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.887499 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-qt6v2" Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.924776 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2eafc357-18f4-49a8-88be-d7e67ed800a0-secret-volume\") pod \"2eafc357-18f4-49a8-88be-d7e67ed800a0\" (UID: \"2eafc357-18f4-49a8-88be-d7e67ed800a0\") " Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.924894 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2eafc357-18f4-49a8-88be-d7e67ed800a0-config-volume\") pod \"2eafc357-18f4-49a8-88be-d7e67ed800a0\" (UID: \"2eafc357-18f4-49a8-88be-d7e67ed800a0\") " Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.925193 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k828r\" (UniqueName: \"kubernetes.io/projected/2eafc357-18f4-49a8-88be-d7e67ed800a0-kube-api-access-k828r\") pod \"2eafc357-18f4-49a8-88be-d7e67ed800a0\" (UID: \"2eafc357-18f4-49a8-88be-d7e67ed800a0\") " Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.927257 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2eafc357-18f4-49a8-88be-d7e67ed800a0-config-volume" (OuterVolumeSpecName: "config-volume") pod "2eafc357-18f4-49a8-88be-d7e67ed800a0" (UID: "2eafc357-18f4-49a8-88be-d7e67ed800a0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.935565 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eafc357-18f4-49a8-88be-d7e67ed800a0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2eafc357-18f4-49a8-88be-d7e67ed800a0" (UID: "2eafc357-18f4-49a8-88be-d7e67ed800a0"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.935565 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eafc357-18f4-49a8-88be-d7e67ed800a0-kube-api-access-k828r" (OuterVolumeSpecName: "kube-api-access-k828r") pod "2eafc357-18f4-49a8-88be-d7e67ed800a0" (UID: "2eafc357-18f4-49a8-88be-d7e67ed800a0"). InnerVolumeSpecName "kube-api-access-k828r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:57:24 crc kubenswrapper[4695]: I0320 10:57:24.938299 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c569b375-f808-4f3d-8f3d-a162677356ff" path="/var/lib/kubelet/pods/c569b375-f808-4f3d-8f3d-a162677356ff/volumes" Mar 20 10:57:25 crc kubenswrapper[4695]: I0320 10:57:25.030403 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k828r\" (UniqueName: \"kubernetes.io/projected/2eafc357-18f4-49a8-88be-d7e67ed800a0-kube-api-access-k828r\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:25 crc kubenswrapper[4695]: I0320 10:57:25.030446 4695 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2eafc357-18f4-49a8-88be-d7e67ed800a0-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:25 crc kubenswrapper[4695]: I0320 10:57:25.030458 4695 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2eafc357-18f4-49a8-88be-d7e67ed800a0-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:25 crc kubenswrapper[4695]: I0320 10:57:25.292670 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-fxxgv" Mar 20 10:57:25 crc kubenswrapper[4695]: I0320 10:57:25.364751 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz" 
event={"ID":"257cb34c-eee5-496b-a2c6-2138ef026e42","Type":"ContainerStarted","Data":"c5b6b458e709a1e714860e623ac759f65ec98325a44534a62d0a4dac8d0b9b88"} Mar 20 10:57:25 crc kubenswrapper[4695]: I0320 10:57:25.366302 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz" Mar 20 10:57:25 crc kubenswrapper[4695]: I0320 10:57:25.369719 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-qt6v2" Mar 20 10:57:25 crc kubenswrapper[4695]: I0320 10:57:25.370373 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-qt6v2" event={"ID":"2eafc357-18f4-49a8-88be-d7e67ed800a0","Type":"ContainerDied","Data":"694569f74274f77741198de3b03d3a83a158867349b7b72b7de7fb478ca7e4e4"} Mar 20 10:57:25 crc kubenswrapper[4695]: I0320 10:57:25.370409 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="694569f74274f77741198de3b03d3a83a158867349b7b72b7de7fb478ca7e4e4" Mar 20 10:57:25 crc kubenswrapper[4695]: I0320 10:57:25.379776 4695 patch_prober.go:28] interesting pod/router-default-5444994796-gnknn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:57:25 crc kubenswrapper[4695]: [-]has-synced failed: reason withheld Mar 20 10:57:25 crc kubenswrapper[4695]: [+]process-running ok Mar 20 10:57:25 crc kubenswrapper[4695]: healthz check failed Mar 20 10:57:25 crc kubenswrapper[4695]: I0320 10:57:25.379834 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnknn" podUID="9e14b497-1345-4352-9b10-10deaeb1f6ef" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:57:25 crc 
kubenswrapper[4695]: I0320 10:57:25.428741 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz" podStartSLOduration=4.428708739 podStartE2EDuration="4.428708739s" podCreationTimestamp="2026-03-20 10:57:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:25.427715543 +0000 UTC m=+223.208321106" watchObservedRunningTime="2026-03-20 10:57:25.428708739 +0000 UTC m=+223.209314302" Mar 20 10:57:25 crc kubenswrapper[4695]: I0320 10:57:25.506138 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz" Mar 20 10:57:25 crc kubenswrapper[4695]: I0320 10:57:25.821574 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:57:25 crc kubenswrapper[4695]: I0320 10:57:25.878573 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27a7790b-af8c-474c-8486-60178b563a04-kubelet-dir\") pod \"27a7790b-af8c-474c-8486-60178b563a04\" (UID: \"27a7790b-af8c-474c-8486-60178b563a04\") " Mar 20 10:57:25 crc kubenswrapper[4695]: I0320 10:57:25.878832 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/27a7790b-af8c-474c-8486-60178b563a04-kube-api-access\") pod \"27a7790b-af8c-474c-8486-60178b563a04\" (UID: \"27a7790b-af8c-474c-8486-60178b563a04\") " Mar 20 10:57:25 crc kubenswrapper[4695]: I0320 10:57:25.880073 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/27a7790b-af8c-474c-8486-60178b563a04-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod 
"27a7790b-af8c-474c-8486-60178b563a04" (UID: "27a7790b-af8c-474c-8486-60178b563a04"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:57:25 crc kubenswrapper[4695]: I0320 10:57:25.890572 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27a7790b-af8c-474c-8486-60178b563a04-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "27a7790b-af8c-474c-8486-60178b563a04" (UID: "27a7790b-af8c-474c-8486-60178b563a04"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:57:25 crc kubenswrapper[4695]: I0320 10:57:25.976779 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:57:25 crc kubenswrapper[4695]: I0320 10:57:25.982098 4695 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27a7790b-af8c-474c-8486-60178b563a04-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:25 crc kubenswrapper[4695]: I0320 10:57:25.982140 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/27a7790b-af8c-474c-8486-60178b563a04-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:26 crc kubenswrapper[4695]: I0320 10:57:26.082768 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5cee76a1-0ffe-4487-817c-725f160b4406-kube-api-access\") pod \"5cee76a1-0ffe-4487-817c-725f160b4406\" (UID: \"5cee76a1-0ffe-4487-817c-725f160b4406\") " Mar 20 10:57:26 crc kubenswrapper[4695]: I0320 10:57:26.082963 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5cee76a1-0ffe-4487-817c-725f160b4406-kubelet-dir\") pod 
\"5cee76a1-0ffe-4487-817c-725f160b4406\" (UID: \"5cee76a1-0ffe-4487-817c-725f160b4406\") " Mar 20 10:57:26 crc kubenswrapper[4695]: I0320 10:57:26.083446 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5cee76a1-0ffe-4487-817c-725f160b4406-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5cee76a1-0ffe-4487-817c-725f160b4406" (UID: "5cee76a1-0ffe-4487-817c-725f160b4406"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:57:26 crc kubenswrapper[4695]: I0320 10:57:26.090170 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cee76a1-0ffe-4487-817c-725f160b4406-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5cee76a1-0ffe-4487-817c-725f160b4406" (UID: "5cee76a1-0ffe-4487-817c-725f160b4406"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:57:26 crc kubenswrapper[4695]: I0320 10:57:26.185034 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5cee76a1-0ffe-4487-817c-725f160b4406-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:26 crc kubenswrapper[4695]: I0320 10:57:26.185077 4695 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5cee76a1-0ffe-4487-817c-725f160b4406-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:57:26 crc kubenswrapper[4695]: I0320 10:57:26.381068 4695 patch_prober.go:28] interesting pod/router-default-5444994796-gnknn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:57:26 crc kubenswrapper[4695]: [-]has-synced failed: reason withheld Mar 20 10:57:26 crc kubenswrapper[4695]: [+]process-running ok Mar 20 10:57:26 crc kubenswrapper[4695]: 
healthz check failed Mar 20 10:57:26 crc kubenswrapper[4695]: I0320 10:57:26.381158 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnknn" podUID="9e14b497-1345-4352-9b10-10deaeb1f6ef" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:57:26 crc kubenswrapper[4695]: I0320 10:57:26.420387 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5cee76a1-0ffe-4487-817c-725f160b4406","Type":"ContainerDied","Data":"7edf0ed0747b5a957780d53f9f873cf8df04e93d466ddd1b5ffb5f495d3a98da"} Mar 20 10:57:26 crc kubenswrapper[4695]: I0320 10:57:26.420441 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7edf0ed0747b5a957780d53f9f873cf8df04e93d466ddd1b5ffb5f495d3a98da" Mar 20 10:57:26 crc kubenswrapper[4695]: I0320 10:57:26.420548 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:57:26 crc kubenswrapper[4695]: I0320 10:57:26.430109 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"27a7790b-af8c-474c-8486-60178b563a04","Type":"ContainerDied","Data":"93e9fb68ef63d517f1b7eca99442fe12d3f11910e5db110fbc15da22e5a5db18"} Mar 20 10:57:26 crc kubenswrapper[4695]: I0320 10:57:26.430192 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93e9fb68ef63d517f1b7eca99442fe12d3f11910e5db110fbc15da22e5a5db18" Mar 20 10:57:26 crc kubenswrapper[4695]: I0320 10:57:26.430215 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:57:27 crc kubenswrapper[4695]: I0320 10:57:27.281121 4695 ???:1] "http: TLS handshake error from 192.168.126.11:53766: no serving certificate available for the kubelet" Mar 20 10:57:27 crc kubenswrapper[4695]: I0320 10:57:27.377418 4695 patch_prober.go:28] interesting pod/router-default-5444994796-gnknn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:57:27 crc kubenswrapper[4695]: [-]has-synced failed: reason withheld Mar 20 10:57:27 crc kubenswrapper[4695]: [+]process-running ok Mar 20 10:57:27 crc kubenswrapper[4695]: healthz check failed Mar 20 10:57:27 crc kubenswrapper[4695]: I0320 10:57:27.377503 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnknn" podUID="9e14b497-1345-4352-9b10-10deaeb1f6ef" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:57:28 crc kubenswrapper[4695]: I0320 10:57:28.375529 4695 patch_prober.go:28] interesting pod/router-default-5444994796-gnknn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:57:28 crc kubenswrapper[4695]: [-]has-synced failed: reason withheld Mar 20 10:57:28 crc kubenswrapper[4695]: [+]process-running ok Mar 20 10:57:28 crc kubenswrapper[4695]: healthz check failed Mar 20 10:57:28 crc kubenswrapper[4695]: I0320 10:57:28.376096 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gnknn" podUID="9e14b497-1345-4352-9b10-10deaeb1f6ef" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:57:29 crc kubenswrapper[4695]: I0320 10:57:29.230937 4695 
patch_prober.go:28] interesting pod/downloads-7954f5f757-dzwsk container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 20 10:57:29 crc kubenswrapper[4695]: I0320 10:57:29.231022 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dzwsk" podUID="acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 20 10:57:29 crc kubenswrapper[4695]: I0320 10:57:29.231065 4695 patch_prober.go:28] interesting pod/downloads-7954f5f757-dzwsk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 20 10:57:29 crc kubenswrapper[4695]: I0320 10:57:29.231153 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dzwsk" podUID="acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 20 10:57:29 crc kubenswrapper[4695]: I0320 10:57:29.374816 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-gnknn" Mar 20 10:57:29 crc kubenswrapper[4695]: I0320 10:57:29.378731 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-gnknn" Mar 20 10:57:29 crc kubenswrapper[4695]: I0320 10:57:29.408351 4695 patch_prober.go:28] interesting pod/console-f9d7485db-s2xcj container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.23:8443/health\": dial tcp 10.217.0.23:8443: 
connect: connection refused" start-of-body= Mar 20 10:57:29 crc kubenswrapper[4695]: I0320 10:57:29.408738 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-s2xcj" podUID="874d0ff7-4923-4423-920a-59e6a632507a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.23:8443/health\": dial tcp 10.217.0.23:8443: connect: connection refused" Mar 20 10:57:29 crc kubenswrapper[4695]: I0320 10:57:29.583015 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0468323-460e-4bf3-be74-9c2330bde834-metrics-certs\") pod \"network-metrics-daemon-h5s76\" (UID: \"e0468323-460e-4bf3-be74-9c2330bde834\") " pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:57:29 crc kubenswrapper[4695]: I0320 10:57:29.617197 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e0468323-460e-4bf3-be74-9c2330bde834-metrics-certs\") pod \"network-metrics-daemon-h5s76\" (UID: \"e0468323-460e-4bf3-be74-9c2330bde834\") " pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:57:29 crc kubenswrapper[4695]: I0320 10:57:29.803738 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h5s76" Mar 20 10:57:33 crc kubenswrapper[4695]: I0320 10:57:33.593077 4695 ???:1] "http: TLS handshake error from 192.168.126.11:53770: no serving certificate available for the kubelet" Mar 20 10:57:38 crc kubenswrapper[4695]: I0320 10:57:38.441134 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:57:38 crc kubenswrapper[4695]: I0320 10:57:38.441600 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:57:38 crc kubenswrapper[4695]: I0320 10:57:38.709963 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77ffd8674f-7mcv2"] Mar 20 10:57:38 crc kubenswrapper[4695]: I0320 10:57:38.710287 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-77ffd8674f-7mcv2" podUID="5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209" containerName="controller-manager" containerID="cri-o://c8ba2e02d27a4cbab7995f8c08ba6c5274fec83b92720fa385863b167108d67e" gracePeriod=30 Mar 20 10:57:38 crc kubenswrapper[4695]: I0320 10:57:38.733507 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz"] Mar 20 10:57:38 crc kubenswrapper[4695]: I0320 10:57:38.733873 4695 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz" podUID="257cb34c-eee5-496b-a2c6-2138ef026e42" containerName="route-controller-manager" containerID="cri-o://c5b6b458e709a1e714860e623ac759f65ec98325a44534a62d0a4dac8d0b9b88" gracePeriod=30 Mar 20 10:57:39 crc kubenswrapper[4695]: I0320 10:57:39.228417 4695 patch_prober.go:28] interesting pod/downloads-7954f5f757-dzwsk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 20 10:57:39 crc kubenswrapper[4695]: I0320 10:57:39.228481 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dzwsk" podUID="acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 20 10:57:39 crc kubenswrapper[4695]: I0320 10:57:39.228552 4695 patch_prober.go:28] interesting pod/downloads-7954f5f757-dzwsk container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 20 10:57:39 crc kubenswrapper[4695]: I0320 10:57:39.228652 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dzwsk" podUID="acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 20 10:57:39 crc kubenswrapper[4695]: I0320 10:57:39.228751 4695 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-dzwsk" Mar 20 10:57:39 crc kubenswrapper[4695]: I0320 10:57:39.229614 4695 patch_prober.go:28] interesting pod/downloads-7954f5f757-dzwsk 
container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 20 10:57:39 crc kubenswrapper[4695]: I0320 10:57:39.229713 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dzwsk" podUID="acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 20 10:57:39 crc kubenswrapper[4695]: I0320 10:57:39.229765 4695 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"3d577d53339f614c718f930af9e8ca507db12b381b72ffd58f59c01bf0ccd16e"} pod="openshift-console/downloads-7954f5f757-dzwsk" containerMessage="Container download-server failed liveness probe, will be restarted" Mar 20 10:57:39 crc kubenswrapper[4695]: I0320 10:57:39.229828 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-dzwsk" podUID="acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564" containerName="download-server" containerID="cri-o://3d577d53339f614c718f930af9e8ca507db12b381b72ffd58f59c01bf0ccd16e" gracePeriod=2 Mar 20 10:57:39 crc kubenswrapper[4695]: I0320 10:57:39.412900 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-s2xcj" Mar 20 10:57:39 crc kubenswrapper[4695]: I0320 10:57:39.416521 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-s2xcj" Mar 20 10:57:40 crc kubenswrapper[4695]: I0320 10:57:40.912543 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" Mar 20 10:57:41 crc kubenswrapper[4695]: I0320 10:57:41.703845 4695 generic.go:334] 
"Generic (PLEG): container finished" podID="5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209" containerID="c8ba2e02d27a4cbab7995f8c08ba6c5274fec83b92720fa385863b167108d67e" exitCode=0 Mar 20 10:57:41 crc kubenswrapper[4695]: I0320 10:57:41.703939 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77ffd8674f-7mcv2" event={"ID":"5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209","Type":"ContainerDied","Data":"c8ba2e02d27a4cbab7995f8c08ba6c5274fec83b92720fa385863b167108d67e"} Mar 20 10:57:42 crc kubenswrapper[4695]: I0320 10:57:42.511891 4695 patch_prober.go:28] interesting pod/controller-manager-77ffd8674f-7mcv2 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": dial tcp 10.217.0.54:8443: connect: connection refused" start-of-body= Mar 20 10:57:42 crc kubenswrapper[4695]: I0320 10:57:42.512061 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-77ffd8674f-7mcv2" podUID="5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.54:8443/healthz\": dial tcp 10.217.0.54:8443: connect: connection refused" Mar 20 10:57:43 crc kubenswrapper[4695]: I0320 10:57:43.486306 4695 patch_prober.go:28] interesting pod/route-controller-manager-fdf66746b-v4bcz container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused" start-of-body= Mar 20 10:57:43 crc kubenswrapper[4695]: I0320 10:57:43.486838 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz" podUID="257cb34c-eee5-496b-a2c6-2138ef026e42" containerName="route-controller-manager" probeResult="failure" output="Get 
\"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused" Mar 20 10:57:49 crc kubenswrapper[4695]: I0320 10:57:49.229862 4695 patch_prober.go:28] interesting pod/downloads-7954f5f757-dzwsk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 20 10:57:49 crc kubenswrapper[4695]: I0320 10:57:49.230464 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dzwsk" podUID="acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 20 10:57:50 crc kubenswrapper[4695]: I0320 10:57:50.153617 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xc2wd" Mar 20 10:57:52 crc kubenswrapper[4695]: I0320 10:57:52.510809 4695 patch_prober.go:28] interesting pod/controller-manager-77ffd8674f-7mcv2 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": dial tcp 10.217.0.54:8443: connect: connection refused" start-of-body= Mar 20 10:57:52 crc kubenswrapper[4695]: I0320 10:57:52.510886 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-77ffd8674f-7mcv2" podUID="5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.54:8443/healthz\": dial tcp 10.217.0.54:8443: connect: connection refused" Mar 20 10:57:53 crc kubenswrapper[4695]: I0320 10:57:53.485309 4695 patch_prober.go:28] interesting pod/route-controller-manager-fdf66746b-v4bcz container/route-controller-manager namespace/openshift-route-controller-manager: 
Readiness probe status=failure output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused" start-of-body= Mar 20 10:57:53 crc kubenswrapper[4695]: I0320 10:57:53.485393 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz" podUID="257cb34c-eee5-496b-a2c6-2138ef026e42" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.56:8443/healthz\": dial tcp 10.217.0.56:8443: connect: connection refused" Mar 20 10:57:54 crc kubenswrapper[4695]: I0320 10:57:54.877596 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 10:57:54 crc kubenswrapper[4695]: E0320 10:57:54.878346 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cee76a1-0ffe-4487-817c-725f160b4406" containerName="pruner" Mar 20 10:57:54 crc kubenswrapper[4695]: I0320 10:57:54.878362 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cee76a1-0ffe-4487-817c-725f160b4406" containerName="pruner" Mar 20 10:57:54 crc kubenswrapper[4695]: E0320 10:57:54.878381 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eafc357-18f4-49a8-88be-d7e67ed800a0" containerName="collect-profiles" Mar 20 10:57:54 crc kubenswrapper[4695]: I0320 10:57:54.878387 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eafc357-18f4-49a8-88be-d7e67ed800a0" containerName="collect-profiles" Mar 20 10:57:54 crc kubenswrapper[4695]: E0320 10:57:54.878407 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27a7790b-af8c-474c-8486-60178b563a04" containerName="pruner" Mar 20 10:57:54 crc kubenswrapper[4695]: I0320 10:57:54.878415 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="27a7790b-af8c-474c-8486-60178b563a04" containerName="pruner" Mar 20 10:57:54 crc kubenswrapper[4695]: I0320 10:57:54.878526 4695 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="5cee76a1-0ffe-4487-817c-725f160b4406" containerName="pruner" Mar 20 10:57:54 crc kubenswrapper[4695]: I0320 10:57:54.878541 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="27a7790b-af8c-474c-8486-60178b563a04" containerName="pruner" Mar 20 10:57:54 crc kubenswrapper[4695]: I0320 10:57:54.878552 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eafc357-18f4-49a8-88be-d7e67ed800a0" containerName="collect-profiles" Mar 20 10:57:54 crc kubenswrapper[4695]: I0320 10:57:54.879136 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:57:54 crc kubenswrapper[4695]: I0320 10:57:54.881878 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 10:57:54 crc kubenswrapper[4695]: I0320 10:57:54.892617 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 10:57:54 crc kubenswrapper[4695]: I0320 10:57:54.898390 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 10:57:54 crc kubenswrapper[4695]: I0320 10:57:54.981294 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ac97aea2-7502-4a8e-a2da-305c88bd2f3a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ac97aea2-7502-4a8e-a2da-305c88bd2f3a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:57:54 crc kubenswrapper[4695]: I0320 10:57:54.981363 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ac97aea2-7502-4a8e-a2da-305c88bd2f3a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ac97aea2-7502-4a8e-a2da-305c88bd2f3a\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:57:55 crc kubenswrapper[4695]: I0320 10:57:55.082603 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ac97aea2-7502-4a8e-a2da-305c88bd2f3a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ac97aea2-7502-4a8e-a2da-305c88bd2f3a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:57:55 crc kubenswrapper[4695]: I0320 10:57:55.082668 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ac97aea2-7502-4a8e-a2da-305c88bd2f3a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ac97aea2-7502-4a8e-a2da-305c88bd2f3a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:57:55 crc kubenswrapper[4695]: I0320 10:57:55.083377 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ac97aea2-7502-4a8e-a2da-305c88bd2f3a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"ac97aea2-7502-4a8e-a2da-305c88bd2f3a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:57:55 crc kubenswrapper[4695]: I0320 10:57:55.105811 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ac97aea2-7502-4a8e-a2da-305c88bd2f3a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"ac97aea2-7502-4a8e-a2da-305c88bd2f3a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:57:55 crc kubenswrapper[4695]: I0320 10:57:55.203182 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:57:55 crc kubenswrapper[4695]: I0320 10:57:55.784865 4695 generic.go:334] "Generic (PLEG): container finished" podID="257cb34c-eee5-496b-a2c6-2138ef026e42" containerID="c5b6b458e709a1e714860e623ac759f65ec98325a44534a62d0a4dac8d0b9b88" exitCode=0 Mar 20 10:57:55 crc kubenswrapper[4695]: I0320 10:57:55.785469 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz" event={"ID":"257cb34c-eee5-496b-a2c6-2138ef026e42","Type":"ContainerDied","Data":"c5b6b458e709a1e714860e623ac759f65ec98325a44534a62d0a4dac8d0b9b88"} Mar 20 10:57:59 crc kubenswrapper[4695]: I0320 10:57:59.235315 4695 patch_prober.go:28] interesting pod/downloads-7954f5f757-dzwsk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 20 10:57:59 crc kubenswrapper[4695]: I0320 10:57:59.236405 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dzwsk" podUID="acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 20 10:57:59 crc kubenswrapper[4695]: E0320 10:57:59.820239 4695 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage3222274338/3\": happened during read: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 10:57:59 crc kubenswrapper[4695]: E0320 10:57:59.820606 4695 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g4wl4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-kt9vc_openshift-marketplace(53830966-0b62-40fe-9f81-c18c95ea50aa): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage3222274338/3\": happened during read: context canceled" logger="UnhandledError" Mar 20 10:57:59 crc kubenswrapper[4695]: E0320 10:57:59.821784 4695 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \\\"/var/tmp/container_images_storage3222274338/3\\\": happened during read: context canceled\"" pod="openshift-marketplace/redhat-operators-kt9vc" podUID="53830966-0b62-40fe-9f81-c18c95ea50aa" Mar 20 10:57:59 crc kubenswrapper[4695]: I0320 10:57:59.823508 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz" event={"ID":"257cb34c-eee5-496b-a2c6-2138ef026e42","Type":"ContainerDied","Data":"a3f969bbd5c58c29e35dd90e70b51691a443238e49776c7761deb36549778a14"} Mar 20 10:57:59 crc kubenswrapper[4695]: I0320 10:57:59.823570 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3f969bbd5c58c29e35dd90e70b51691a443238e49776c7761deb36549778a14" Mar 20 10:57:59 crc kubenswrapper[4695]: I0320 10:57:59.827051 4695 generic.go:334] "Generic (PLEG): container finished" podID="acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564" containerID="3d577d53339f614c718f930af9e8ca507db12b381b72ffd58f59c01bf0ccd16e" exitCode=0 Mar 20 10:57:59 crc kubenswrapper[4695]: I0320 10:57:59.827087 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-dzwsk" event={"ID":"acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564","Type":"ContainerDied","Data":"3d577d53339f614c718f930af9e8ca507db12b381b72ffd58f59c01bf0ccd16e"} Mar 20 10:57:59 crc kubenswrapper[4695]: I0320 10:57:59.829160 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-77ffd8674f-7mcv2" event={"ID":"5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209","Type":"ContainerDied","Data":"8277ed1ffab2190a499e3962559c1a6a92bbdbbfc8fc390dc89f2575a24ac8c5"} Mar 20 10:57:59 crc kubenswrapper[4695]: I0320 10:57:59.829212 4695 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="8277ed1ffab2190a499e3962559c1a6a92bbdbbfc8fc390dc89f2575a24ac8c5" Mar 20 10:57:59 crc kubenswrapper[4695]: I0320 10:57:59.834811 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz" Mar 20 10:57:59 crc kubenswrapper[4695]: I0320 10:57:59.840129 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77ffd8674f-7mcv2" Mar 20 10:57:59 crc kubenswrapper[4695]: I0320 10:57:59.972062 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209-serving-cert\") pod \"5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209\" (UID: \"5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209\") " Mar 20 10:57:59 crc kubenswrapper[4695]: I0320 10:57:59.972467 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/257cb34c-eee5-496b-a2c6-2138ef026e42-config\") pod \"257cb34c-eee5-496b-a2c6-2138ef026e42\" (UID: \"257cb34c-eee5-496b-a2c6-2138ef026e42\") " Mar 20 10:57:59 crc kubenswrapper[4695]: I0320 10:57:59.972510 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209-proxy-ca-bundles\") pod \"5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209\" (UID: \"5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209\") " Mar 20 10:57:59 crc kubenswrapper[4695]: I0320 10:57:59.972538 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/257cb34c-eee5-496b-a2c6-2138ef026e42-serving-cert\") pod \"257cb34c-eee5-496b-a2c6-2138ef026e42\" (UID: \"257cb34c-eee5-496b-a2c6-2138ef026e42\") " Mar 20 10:57:59 crc kubenswrapper[4695]: I0320 10:57:59.972560 4695 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/257cb34c-eee5-496b-a2c6-2138ef026e42-client-ca\") pod \"257cb34c-eee5-496b-a2c6-2138ef026e42\" (UID: \"257cb34c-eee5-496b-a2c6-2138ef026e42\") " Mar 20 10:57:59 crc kubenswrapper[4695]: I0320 10:57:59.972580 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw859\" (UniqueName: \"kubernetes.io/projected/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209-kube-api-access-dw859\") pod \"5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209\" (UID: \"5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209\") " Mar 20 10:57:59 crc kubenswrapper[4695]: I0320 10:57:59.972611 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209-client-ca\") pod \"5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209\" (UID: \"5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209\") " Mar 20 10:57:59 crc kubenswrapper[4695]: I0320 10:57:59.972635 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209-config\") pod \"5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209\" (UID: \"5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209\") " Mar 20 10:57:59 crc kubenswrapper[4695]: I0320 10:57:59.972651 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9lpk\" (UniqueName: \"kubernetes.io/projected/257cb34c-eee5-496b-a2c6-2138ef026e42-kube-api-access-m9lpk\") pod \"257cb34c-eee5-496b-a2c6-2138ef026e42\" (UID: \"257cb34c-eee5-496b-a2c6-2138ef026e42\") " Mar 20 10:57:59 crc kubenswrapper[4695]: I0320 10:57:59.974079 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/257cb34c-eee5-496b-a2c6-2138ef026e42-client-ca" (OuterVolumeSpecName: "client-ca") pod "257cb34c-eee5-496b-a2c6-2138ef026e42" (UID: 
"257cb34c-eee5-496b-a2c6-2138ef026e42"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:59 crc kubenswrapper[4695]: I0320 10:57:59.974205 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/257cb34c-eee5-496b-a2c6-2138ef026e42-config" (OuterVolumeSpecName: "config") pod "257cb34c-eee5-496b-a2c6-2138ef026e42" (UID: "257cb34c-eee5-496b-a2c6-2138ef026e42"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:59 crc kubenswrapper[4695]: I0320 10:57:59.974645 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209" (UID: "5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:59 crc kubenswrapper[4695]: I0320 10:57:59.975004 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209-client-ca" (OuterVolumeSpecName: "client-ca") pod "5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209" (UID: "5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:59 crc kubenswrapper[4695]: I0320 10:57:59.975082 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209-config" (OuterVolumeSpecName: "config") pod "5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209" (UID: "5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:57:59 crc kubenswrapper[4695]: I0320 10:57:59.980101 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209" (UID: "5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:59 crc kubenswrapper[4695]: I0320 10:57:59.980161 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/257cb34c-eee5-496b-a2c6-2138ef026e42-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "257cb34c-eee5-496b-a2c6-2138ef026e42" (UID: "257cb34c-eee5-496b-a2c6-2138ef026e42"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:57:59 crc kubenswrapper[4695]: I0320 10:57:59.980181 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209-kube-api-access-dw859" (OuterVolumeSpecName: "kube-api-access-dw859") pod "5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209" (UID: "5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209"). InnerVolumeSpecName "kube-api-access-dw859". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:57:59 crc kubenswrapper[4695]: I0320 10:57:59.980935 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/257cb34c-eee5-496b-a2c6-2138ef026e42-kube-api-access-m9lpk" (OuterVolumeSpecName: "kube-api-access-m9lpk") pod "257cb34c-eee5-496b-a2c6-2138ef026e42" (UID: "257cb34c-eee5-496b-a2c6-2138ef026e42"). InnerVolumeSpecName "kube-api-access-m9lpk". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.075273 4695 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.075325 4695 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/257cb34c-eee5-496b-a2c6-2138ef026e42-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.075339 4695 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/257cb34c-eee5-496b-a2c6-2138ef026e42-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.075351 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw859\" (UniqueName: \"kubernetes.io/projected/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209-kube-api-access-dw859\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.075368 4695 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.075380 4695 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209-config\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.075397 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9lpk\" (UniqueName: \"kubernetes.io/projected/257cb34c-eee5-496b-a2c6-2138ef026e42-kube-api-access-m9lpk\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.075407 4695 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.075418 4695 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/257cb34c-eee5-496b-a2c6-2138ef026e42-config\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.092181 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7674466785-268m7"]
Mar 20 10:58:00 crc kubenswrapper[4695]: E0320 10:58:00.092552 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="257cb34c-eee5-496b-a2c6-2138ef026e42" containerName="route-controller-manager"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.092572 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="257cb34c-eee5-496b-a2c6-2138ef026e42" containerName="route-controller-manager"
Mar 20 10:58:00 crc kubenswrapper[4695]: E0320 10:58:00.092585 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209" containerName="controller-manager"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.092591 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209" containerName="controller-manager"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.092752 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209" containerName="controller-manager"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.092773 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="257cb34c-eee5-496b-a2c6-2138ef026e42" containerName="route-controller-manager"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.093510 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7674466785-268m7"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.105722 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7674466785-268m7"]
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.144340 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566738-dkbr4"]
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.145964 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566738-dkbr4"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.150451 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5kqds"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.160883 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566738-dkbr4"]
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.273480 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.274381 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.278898 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ef41a3d-8fcb-4013-8a20-2e9e39e152aa-serving-cert\") pod \"route-controller-manager-7674466785-268m7\" (UID: \"7ef41a3d-8fcb-4013-8a20-2e9e39e152aa\") " pod="openshift-route-controller-manager/route-controller-manager-7674466785-268m7"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.278985 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dvbm\" (UniqueName: \"kubernetes.io/projected/e3280710-4b29-43c1-8f83-4c18670a9e0a-kube-api-access-9dvbm\") pod \"auto-csr-approver-29566738-dkbr4\" (UID: \"e3280710-4b29-43c1-8f83-4c18670a9e0a\") " pod="openshift-infra/auto-csr-approver-29566738-dkbr4"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.279126 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ef41a3d-8fcb-4013-8a20-2e9e39e152aa-client-ca\") pod \"route-controller-manager-7674466785-268m7\" (UID: \"7ef41a3d-8fcb-4013-8a20-2e9e39e152aa\") " pod="openshift-route-controller-manager/route-controller-manager-7674466785-268m7"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.279569 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ef41a3d-8fcb-4013-8a20-2e9e39e152aa-config\") pod \"route-controller-manager-7674466785-268m7\" (UID: \"7ef41a3d-8fcb-4013-8a20-2e9e39e152aa\") " pod="openshift-route-controller-manager/route-controller-manager-7674466785-268m7"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.279711 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4jms\" (UniqueName: \"kubernetes.io/projected/7ef41a3d-8fcb-4013-8a20-2e9e39e152aa-kube-api-access-c4jms\") pod \"route-controller-manager-7674466785-268m7\" (UID: \"7ef41a3d-8fcb-4013-8a20-2e9e39e152aa\") " pod="openshift-route-controller-manager/route-controller-manager-7674466785-268m7"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.283610 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.381776 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ef41a3d-8fcb-4013-8a20-2e9e39e152aa-config\") pod \"route-controller-manager-7674466785-268m7\" (UID: \"7ef41a3d-8fcb-4013-8a20-2e9e39e152aa\") " pod="openshift-route-controller-manager/route-controller-manager-7674466785-268m7"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.381877 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4jms\" (UniqueName: \"kubernetes.io/projected/7ef41a3d-8fcb-4013-8a20-2e9e39e152aa-kube-api-access-c4jms\") pod \"route-controller-manager-7674466785-268m7\" (UID: \"7ef41a3d-8fcb-4013-8a20-2e9e39e152aa\") " pod="openshift-route-controller-manager/route-controller-manager-7674466785-268m7"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.381949 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ef41a3d-8fcb-4013-8a20-2e9e39e152aa-serving-cert\") pod \"route-controller-manager-7674466785-268m7\" (UID: \"7ef41a3d-8fcb-4013-8a20-2e9e39e152aa\") " pod="openshift-route-controller-manager/route-controller-manager-7674466785-268m7"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.381988 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dvbm\" (UniqueName: \"kubernetes.io/projected/e3280710-4b29-43c1-8f83-4c18670a9e0a-kube-api-access-9dvbm\") pod \"auto-csr-approver-29566738-dkbr4\" (UID: \"e3280710-4b29-43c1-8f83-4c18670a9e0a\") " pod="openshift-infra/auto-csr-approver-29566738-dkbr4"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.382019 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a82b7512-ebb9-42df-aff3-e8a7077b0fcf-kube-api-access\") pod \"installer-9-crc\" (UID: \"a82b7512-ebb9-42df-aff3-e8a7077b0fcf\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.382045 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a82b7512-ebb9-42df-aff3-e8a7077b0fcf-var-lock\") pod \"installer-9-crc\" (UID: \"a82b7512-ebb9-42df-aff3-e8a7077b0fcf\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.382081 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a82b7512-ebb9-42df-aff3-e8a7077b0fcf-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a82b7512-ebb9-42df-aff3-e8a7077b0fcf\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.382119 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ef41a3d-8fcb-4013-8a20-2e9e39e152aa-client-ca\") pod \"route-controller-manager-7674466785-268m7\" (UID: \"7ef41a3d-8fcb-4013-8a20-2e9e39e152aa\") " pod="openshift-route-controller-manager/route-controller-manager-7674466785-268m7"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.382940 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ef41a3d-8fcb-4013-8a20-2e9e39e152aa-client-ca\") pod \"route-controller-manager-7674466785-268m7\" (UID: \"7ef41a3d-8fcb-4013-8a20-2e9e39e152aa\") " pod="openshift-route-controller-manager/route-controller-manager-7674466785-268m7"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.383020 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ef41a3d-8fcb-4013-8a20-2e9e39e152aa-config\") pod \"route-controller-manager-7674466785-268m7\" (UID: \"7ef41a3d-8fcb-4013-8a20-2e9e39e152aa\") " pod="openshift-route-controller-manager/route-controller-manager-7674466785-268m7"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.397268 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ef41a3d-8fcb-4013-8a20-2e9e39e152aa-serving-cert\") pod \"route-controller-manager-7674466785-268m7\" (UID: \"7ef41a3d-8fcb-4013-8a20-2e9e39e152aa\") " pod="openshift-route-controller-manager/route-controller-manager-7674466785-268m7"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.399871 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4jms\" (UniqueName: \"kubernetes.io/projected/7ef41a3d-8fcb-4013-8a20-2e9e39e152aa-kube-api-access-c4jms\") pod \"route-controller-manager-7674466785-268m7\" (UID: \"7ef41a3d-8fcb-4013-8a20-2e9e39e152aa\") " pod="openshift-route-controller-manager/route-controller-manager-7674466785-268m7"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.400343 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dvbm\" (UniqueName: \"kubernetes.io/projected/e3280710-4b29-43c1-8f83-4c18670a9e0a-kube-api-access-9dvbm\") pod \"auto-csr-approver-29566738-dkbr4\" (UID: \"e3280710-4b29-43c1-8f83-4c18670a9e0a\") " pod="openshift-infra/auto-csr-approver-29566738-dkbr4"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.460109 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7674466785-268m7"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.468762 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566738-dkbr4"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.484856 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a82b7512-ebb9-42df-aff3-e8a7077b0fcf-kube-api-access\") pod \"installer-9-crc\" (UID: \"a82b7512-ebb9-42df-aff3-e8a7077b0fcf\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.484958 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a82b7512-ebb9-42df-aff3-e8a7077b0fcf-var-lock\") pod \"installer-9-crc\" (UID: \"a82b7512-ebb9-42df-aff3-e8a7077b0fcf\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.485012 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a82b7512-ebb9-42df-aff3-e8a7077b0fcf-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a82b7512-ebb9-42df-aff3-e8a7077b0fcf\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.485194 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a82b7512-ebb9-42df-aff3-e8a7077b0fcf-kubelet-dir\") pod \"installer-9-crc\" (UID: \"a82b7512-ebb9-42df-aff3-e8a7077b0fcf\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.485200 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a82b7512-ebb9-42df-aff3-e8a7077b0fcf-var-lock\") pod \"installer-9-crc\" (UID: \"a82b7512-ebb9-42df-aff3-e8a7077b0fcf\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.504016 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a82b7512-ebb9-42df-aff3-e8a7077b0fcf-kube-api-access\") pod \"installer-9-crc\" (UID: \"a82b7512-ebb9-42df-aff3-e8a7077b0fcf\") " pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.617461 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.834849 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-77ffd8674f-7mcv2"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.834849 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz"
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.896789 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-77ffd8674f-7mcv2"]
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.897636 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-77ffd8674f-7mcv2"]
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.904033 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz"]
Mar 20 10:58:00 crc kubenswrapper[4695]: I0320 10:58:00.908609 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fdf66746b-v4bcz"]
Mar 20 10:58:00 crc kubenswrapper[4695]: E0320 10:58:00.961133 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-kt9vc" podUID="53830966-0b62-40fe-9f81-c18c95ea50aa"
Mar 20 10:58:01 crc kubenswrapper[4695]: E0320 10:58:01.000705 4695 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest"
Mar 20 10:58:01 crc kubenswrapper[4695]: E0320 10:58:01.000937 4695 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 10:58:01 crc kubenswrapper[4695]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve
Mar 20 10:58:01 crc kubenswrapper[4695]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wlw29,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29566736-t45k7_openshift-infra(b1ec0c6c-21bf-4b5f-b973-fd68c0d1c0f5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled
Mar 20 10:58:01 crc kubenswrapper[4695]: > logger="UnhandledError"
Mar 20 10:58:01 crc kubenswrapper[4695]: E0320 10:58:01.003088 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29566736-t45k7" podUID="b1ec0c6c-21bf-4b5f-b973-fd68c0d1c0f5"
Mar 20 10:58:01 crc kubenswrapper[4695]: E0320 10:58:01.851286 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29566736-t45k7" podUID="b1ec0c6c-21bf-4b5f-b973-fd68c0d1c0f5"
Mar 20 10:58:02 crc kubenswrapper[4695]: I0320 10:58:02.093889 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-54666897bb-f56tw"]
Mar 20 10:58:02 crc kubenswrapper[4695]: I0320 10:58:02.095171 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54666897bb-f56tw"
Mar 20 10:58:02 crc kubenswrapper[4695]: I0320 10:58:02.099006 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 10:58:02 crc kubenswrapper[4695]: I0320 10:58:02.099468 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 10:58:02 crc kubenswrapper[4695]: I0320 10:58:02.099010 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 10:58:02 crc kubenswrapper[4695]: I0320 10:58:02.099638 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 10:58:02 crc kubenswrapper[4695]: I0320 10:58:02.100082 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 20 10:58:02 crc kubenswrapper[4695]: I0320 10:58:02.102948 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 10:58:02 crc kubenswrapper[4695]: I0320 10:58:02.108475 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 20 10:58:02 crc kubenswrapper[4695]: I0320 10:58:02.119286 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/228abe7a-c73b-42db-8162-7a3b424eef42-config\") pod \"controller-manager-54666897bb-f56tw\" (UID: \"228abe7a-c73b-42db-8162-7a3b424eef42\") " pod="openshift-controller-manager/controller-manager-54666897bb-f56tw"
Mar 20 10:58:02 crc kubenswrapper[4695]: I0320 10:58:02.119337 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xczf4\" (UniqueName: \"kubernetes.io/projected/228abe7a-c73b-42db-8162-7a3b424eef42-kube-api-access-xczf4\") pod \"controller-manager-54666897bb-f56tw\" (UID: \"228abe7a-c73b-42db-8162-7a3b424eef42\") " pod="openshift-controller-manager/controller-manager-54666897bb-f56tw"
Mar 20 10:58:02 crc kubenswrapper[4695]: I0320 10:58:02.119458 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/228abe7a-c73b-42db-8162-7a3b424eef42-client-ca\") pod \"controller-manager-54666897bb-f56tw\" (UID: \"228abe7a-c73b-42db-8162-7a3b424eef42\") " pod="openshift-controller-manager/controller-manager-54666897bb-f56tw"
Mar 20 10:58:02 crc kubenswrapper[4695]: I0320 10:58:02.119552 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/228abe7a-c73b-42db-8162-7a3b424eef42-proxy-ca-bundles\") pod \"controller-manager-54666897bb-f56tw\" (UID: \"228abe7a-c73b-42db-8162-7a3b424eef42\") " pod="openshift-controller-manager/controller-manager-54666897bb-f56tw"
Mar 20 10:58:02 crc kubenswrapper[4695]: I0320 10:58:02.119628 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/228abe7a-c73b-42db-8162-7a3b424eef42-serving-cert\") pod \"controller-manager-54666897bb-f56tw\" (UID: \"228abe7a-c73b-42db-8162-7a3b424eef42\") " pod="openshift-controller-manager/controller-manager-54666897bb-f56tw"
Mar 20 10:58:02 crc kubenswrapper[4695]: I0320 10:58:02.121558 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-54666897bb-f56tw"]
Mar 20 10:58:02 crc kubenswrapper[4695]: I0320 10:58:02.220578 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/228abe7a-c73b-42db-8162-7a3b424eef42-client-ca\") pod \"controller-manager-54666897bb-f56tw\" (UID: \"228abe7a-c73b-42db-8162-7a3b424eef42\") " pod="openshift-controller-manager/controller-manager-54666897bb-f56tw"
Mar 20 10:58:02 crc kubenswrapper[4695]: I0320 10:58:02.221103 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/228abe7a-c73b-42db-8162-7a3b424eef42-proxy-ca-bundles\") pod \"controller-manager-54666897bb-f56tw\" (UID: \"228abe7a-c73b-42db-8162-7a3b424eef42\") " pod="openshift-controller-manager/controller-manager-54666897bb-f56tw"
Mar 20 10:58:02 crc kubenswrapper[4695]: I0320 10:58:02.221162 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/228abe7a-c73b-42db-8162-7a3b424eef42-serving-cert\") pod \"controller-manager-54666897bb-f56tw\" (UID: \"228abe7a-c73b-42db-8162-7a3b424eef42\") " pod="openshift-controller-manager/controller-manager-54666897bb-f56tw"
Mar 20 10:58:02 crc kubenswrapper[4695]: I0320 10:58:02.221218 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/228abe7a-c73b-42db-8162-7a3b424eef42-config\") pod \"controller-manager-54666897bb-f56tw\" (UID: \"228abe7a-c73b-42db-8162-7a3b424eef42\") " pod="openshift-controller-manager/controller-manager-54666897bb-f56tw"
Mar 20 10:58:02 crc kubenswrapper[4695]: I0320 10:58:02.221257 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xczf4\" (UniqueName: \"kubernetes.io/projected/228abe7a-c73b-42db-8162-7a3b424eef42-kube-api-access-xczf4\") pod \"controller-manager-54666897bb-f56tw\" (UID: \"228abe7a-c73b-42db-8162-7a3b424eef42\") " pod="openshift-controller-manager/controller-manager-54666897bb-f56tw"
Mar 20 10:58:02 crc kubenswrapper[4695]: I0320 10:58:02.223250 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/228abe7a-c73b-42db-8162-7a3b424eef42-client-ca\") pod \"controller-manager-54666897bb-f56tw\" (UID: \"228abe7a-c73b-42db-8162-7a3b424eef42\") " pod="openshift-controller-manager/controller-manager-54666897bb-f56tw"
Mar 20 10:58:02 crc kubenswrapper[4695]: I0320 10:58:02.223310 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/228abe7a-c73b-42db-8162-7a3b424eef42-proxy-ca-bundles\") pod \"controller-manager-54666897bb-f56tw\" (UID: \"228abe7a-c73b-42db-8162-7a3b424eef42\") " pod="openshift-controller-manager/controller-manager-54666897bb-f56tw"
Mar 20 10:58:02 crc kubenswrapper[4695]: I0320 10:58:02.223563 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/228abe7a-c73b-42db-8162-7a3b424eef42-config\") pod \"controller-manager-54666897bb-f56tw\" (UID: \"228abe7a-c73b-42db-8162-7a3b424eef42\") " pod="openshift-controller-manager/controller-manager-54666897bb-f56tw"
Mar 20 10:58:02 crc kubenswrapper[4695]: I0320 10:58:02.232471 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/228abe7a-c73b-42db-8162-7a3b424eef42-serving-cert\") pod \"controller-manager-54666897bb-f56tw\" (UID: \"228abe7a-c73b-42db-8162-7a3b424eef42\") " pod="openshift-controller-manager/controller-manager-54666897bb-f56tw"
Mar 20 10:58:02 crc kubenswrapper[4695]: I0320 10:58:02.241934 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xczf4\" (UniqueName: \"kubernetes.io/projected/228abe7a-c73b-42db-8162-7a3b424eef42-kube-api-access-xczf4\") pod \"controller-manager-54666897bb-f56tw\" (UID: \"228abe7a-c73b-42db-8162-7a3b424eef42\") " pod="openshift-controller-manager/controller-manager-54666897bb-f56tw"
Mar 20 10:58:02 crc kubenswrapper[4695]: I0320 10:58:02.451372 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54666897bb-f56tw"
Mar 20 10:58:02 crc kubenswrapper[4695]: I0320 10:58:02.894072 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="257cb34c-eee5-496b-a2c6-2138ef026e42" path="/var/lib/kubelet/pods/257cb34c-eee5-496b-a2c6-2138ef026e42/volumes"
Mar 20 10:58:02 crc kubenswrapper[4695]: I0320 10:58:02.894993 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209" path="/var/lib/kubelet/pods/5f7d572b-9ae9-4fd6-bdc2-9c56b9d4b209/volumes"
Mar 20 10:58:07 crc kubenswrapper[4695]: E0320 10:58:07.895360 4695 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Mar 20 10:58:07 crc kubenswrapper[4695]: E0320 10:58:07.896388 4695 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rxcph,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-28xd8_openshift-marketplace(da83bf65-5995-41cf-8f79-98a77e0ace2e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 20 10:58:07 crc kubenswrapper[4695]: E0320 10:58:07.897575 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-28xd8" podUID="da83bf65-5995-41cf-8f79-98a77e0ace2e"
Mar 20 10:58:08 crc kubenswrapper[4695]: I0320 10:58:08.276697 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r82b4"]
Mar 20 10:58:08 crc kubenswrapper[4695]: I0320 10:58:08.430800 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 10:58:08 crc kubenswrapper[4695]: I0320 10:58:08.430892 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 10:58:08 crc kubenswrapper[4695]: I0320 10:58:08.430985 4695 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5"
Mar 20 10:58:08 crc kubenswrapper[4695]: I0320 10:58:08.431713 4695 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f52c3cc7c395c498c816cd540172b9c782623535c14aff204ea0efa08008cef3"} pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 10:58:08 crc kubenswrapper[4695]: I0320 10:58:08.431791 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" containerID="cri-o://f52c3cc7c395c498c816cd540172b9c782623535c14aff204ea0efa08008cef3" gracePeriod=600
Mar 20 10:58:08 crc kubenswrapper[4695]: I0320 10:58:08.899433 4695 generic.go:334] "Generic (PLEG): container finished" podID="7859c924-84d7-4855-901e-c77a02c56e3a" containerID="f52c3cc7c395c498c816cd540172b9c782623535c14aff204ea0efa08008cef3" exitCode=0
Mar 20 10:58:08 crc kubenswrapper[4695]: I0320 10:58:08.901355 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" event={"ID":"7859c924-84d7-4855-901e-c77a02c56e3a","Type":"ContainerDied","Data":"f52c3cc7c395c498c816cd540172b9c782623535c14aff204ea0efa08008cef3"}
Mar 20 10:58:09 crc kubenswrapper[4695]: I0320 10:58:09.228492 4695 patch_prober.go:28] interesting pod/downloads-7954f5f757-dzwsk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Mar 20 10:58:09 crc kubenswrapper[4695]: I0320 10:58:09.228571 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dzwsk" podUID="acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Mar 20 10:58:10 crc kubenswrapper[4695]: E0320 10:58:10.186499 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-28xd8" podUID="da83bf65-5995-41cf-8f79-98a77e0ace2e"
Mar 20 10:58:10 crc kubenswrapper[4695]: E0320 10:58:10.254355 4695 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Mar 20 10:58:10 crc kubenswrapper[4695]: E0320 10:58:10.254938 4695 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lppf4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-6fqn2_openshift-marketplace(33f393cc-11cf-4c7a-aeac-8423998e5dc6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 20 10:58:10 crc kubenswrapper[4695]: E0320 10:58:10.256166 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-6fqn2" podUID="33f393cc-11cf-4c7a-aeac-8423998e5dc6"
Mar 20 10:58:14 crc kubenswrapper[4695]: I0320 10:58:14.583320 4695 ???:1] "http: TLS handshake error from 192.168.126.11:53976: no serving certificate available for the kubelet"
Mar 20 10:58:15 crc kubenswrapper[4695]: E0320 10:58:15.745148 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-6fqn2" podUID="33f393cc-11cf-4c7a-aeac-8423998e5dc6"
Mar 20 10:58:15 crc kubenswrapper[4695]: E0320 10:58:15.840090 4695 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Mar 20 10:58:15 crc kubenswrapper[4695]: E0320 10:58:15.840382 4695 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w6786,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-mzrcb_openshift-marketplace(e96190cb-8d03-4cb7-b3f6-6b46a141f969): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 20 10:58:15 crc kubenswrapper[4695]: E0320 10:58:15.841550 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-mzrcb" podUID="e96190cb-8d03-4cb7-b3f6-6b46a141f969"
Mar 20 10:58:15 crc
kubenswrapper[4695]: E0320 10:58:15.852051 4695 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 10:58:15 crc kubenswrapper[4695]: E0320 10:58:15.852258 4695 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hw9nb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-jnfk6_openshift-marketplace(2a6824e3-65ec-404c-ac28-59fce8d50d83): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:58:15 crc kubenswrapper[4695]: E0320 10:58:15.853423 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-jnfk6" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" Mar 20 10:58:15 crc kubenswrapper[4695]: E0320 10:58:15.865341 4695 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 10:58:15 crc kubenswrapper[4695]: E0320 10:58:15.865529 4695 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5c4mc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-shq4g_openshift-marketplace(eb233657-545c-4a0b-93a0-b29148b5cb3f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:58:15 crc kubenswrapper[4695]: E0320 10:58:15.866779 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-shq4g" podUID="eb233657-545c-4a0b-93a0-b29148b5cb3f" Mar 20 10:58:16 crc 
kubenswrapper[4695]: I0320 10:58:16.032299 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-h5s76"] Mar 20 10:58:18 crc kubenswrapper[4695]: I0320 10:58:18.688306 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54666897bb-f56tw"] Mar 20 10:58:18 crc kubenswrapper[4695]: I0320 10:58:18.711364 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7674466785-268m7"] Mar 20 10:58:19 crc kubenswrapper[4695]: I0320 10:58:19.228419 4695 patch_prober.go:28] interesting pod/downloads-7954f5f757-dzwsk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 20 10:58:19 crc kubenswrapper[4695]: I0320 10:58:19.228492 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dzwsk" podUID="acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 20 10:58:20 crc kubenswrapper[4695]: E0320 10:58:20.276695 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-jnfk6" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" Mar 20 10:58:20 crc kubenswrapper[4695]: E0320 10:58:20.277637 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-mzrcb" 
podUID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" Mar 20 10:58:20 crc kubenswrapper[4695]: E0320 10:58:20.277740 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-shq4g" podUID="eb233657-545c-4a0b-93a0-b29148b5cb3f" Mar 20 10:58:20 crc kubenswrapper[4695]: I0320 10:58:20.778778 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54666897bb-f56tw"] Mar 20 10:58:20 crc kubenswrapper[4695]: W0320 10:58:20.793451 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod228abe7a_c73b_42db_8162_7a3b424eef42.slice/crio-0d360bd6624916fcdf400bb43639faef4b58267a0e05db440710de7939ae7e38 WatchSource:0}: Error finding container 0d360bd6624916fcdf400bb43639faef4b58267a0e05db440710de7939ae7e38: Status 404 returned error can't find the container with id 0d360bd6624916fcdf400bb43639faef4b58267a0e05db440710de7939ae7e38 Mar 20 10:58:20 crc kubenswrapper[4695]: I0320 10:58:20.813498 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 10:58:20 crc kubenswrapper[4695]: I0320 10:58:20.878852 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7674466785-268m7"] Mar 20 10:58:20 crc kubenswrapper[4695]: I0320 10:58:20.899097 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566738-dkbr4"] Mar 20 10:58:20 crc kubenswrapper[4695]: I0320 10:58:20.960304 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 10:58:21 crc kubenswrapper[4695]: I0320 10:58:21.014342 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-54666897bb-f56tw" event={"ID":"228abe7a-c73b-42db-8162-7a3b424eef42","Type":"ContainerStarted","Data":"0d360bd6624916fcdf400bb43639faef4b58267a0e05db440710de7939ae7e38"} Mar 20 10:58:21 crc kubenswrapper[4695]: I0320 10:58:21.021760 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566738-dkbr4" event={"ID":"e3280710-4b29-43c1-8f83-4c18670a9e0a","Type":"ContainerStarted","Data":"93586f846b4b452fe26bde53ddc06e5d78f06aafe418eda7d2f44947d95ca84d"} Mar 20 10:58:21 crc kubenswrapper[4695]: I0320 10:58:21.024745 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h5s76" event={"ID":"e0468323-460e-4bf3-be74-9c2330bde834","Type":"ContainerStarted","Data":"1380f60973f56bd11a4e3abfbc3d54cc9f02e8e99522176963b43f7875f357bc"} Mar 20 10:58:21 crc kubenswrapper[4695]: I0320 10:58:21.027436 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a82b7512-ebb9-42df-aff3-e8a7077b0fcf","Type":"ContainerStarted","Data":"360cd8e879fb363c0ce407fc88c4d6441f6d1b7e550ee2e9e693bc9b1a9733df"} Mar 20 10:58:21 crc kubenswrapper[4695]: I0320 10:58:21.029579 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7674466785-268m7" event={"ID":"7ef41a3d-8fcb-4013-8a20-2e9e39e152aa","Type":"ContainerStarted","Data":"e21331722c94e05687f25c353c5f89dba5dcd1a3eab7a3f89f66738698d77bf2"} Mar 20 10:58:22 crc kubenswrapper[4695]: I0320 10:58:22.037501 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h5s76" event={"ID":"e0468323-460e-4bf3-be74-9c2330bde834","Type":"ContainerStarted","Data":"ba8e005dbd613cda97d8f022d0117b3b6abc0f24d39a93f4f4270e22761c561e"} Mar 20 10:58:22 crc kubenswrapper[4695]: I0320 10:58:22.039335 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/downloads-7954f5f757-dzwsk" event={"ID":"acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564","Type":"ContainerStarted","Data":"716594e808c70a6c43cdd439ef3f4f7c953917a360b8b07093fa1d9ca471071d"} Mar 20 10:58:22 crc kubenswrapper[4695]: I0320 10:58:22.040200 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ac97aea2-7502-4a8e-a2da-305c88bd2f3a","Type":"ContainerStarted","Data":"41b1900b01ad99043c9d53822cbed5833950341369db4c2f3bf3d13b3f3227d3"} Mar 20 10:58:23 crc kubenswrapper[4695]: I0320 10:58:23.051268 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a82b7512-ebb9-42df-aff3-e8a7077b0fcf","Type":"ContainerStarted","Data":"2cd830d4896439d024a1e2a77b1f538d3fae2bebae9f4a05964503da39f2ef5c"} Mar 20 10:58:23 crc kubenswrapper[4695]: I0320 10:58:23.053569 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" event={"ID":"7859c924-84d7-4855-901e-c77a02c56e3a","Type":"ContainerStarted","Data":"ec82d0faf7e4964b116a15802965ff707a29f15669e56ffaba0e38d32bd99a78"} Mar 20 10:58:23 crc kubenswrapper[4695]: I0320 10:58:23.054847 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7674466785-268m7" event={"ID":"7ef41a3d-8fcb-4013-8a20-2e9e39e152aa","Type":"ContainerStarted","Data":"ead18e6d5429ea8e0279bccb3b6ddcf8d81e08603bbdb173e48f2c1684a31484"} Mar 20 10:58:23 crc kubenswrapper[4695]: I0320 10:58:23.055036 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7674466785-268m7" podUID="7ef41a3d-8fcb-4013-8a20-2e9e39e152aa" containerName="route-controller-manager" containerID="cri-o://ead18e6d5429ea8e0279bccb3b6ddcf8d81e08603bbdb173e48f2c1684a31484" gracePeriod=30 Mar 20 10:58:23 crc kubenswrapper[4695]: I0320 
10:58:23.055845 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7674466785-268m7" Mar 20 10:58:23 crc kubenswrapper[4695]: I0320 10:58:23.064123 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7674466785-268m7" Mar 20 10:58:23 crc kubenswrapper[4695]: I0320 10:58:23.066953 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54666897bb-f56tw" event={"ID":"228abe7a-c73b-42db-8162-7a3b424eef42","Type":"ContainerStarted","Data":"b47f0210f305b7ccf33acac8e11734b43f0c00a76d7232fe0878241e41b50cc1"} Mar 20 10:58:23 crc kubenswrapper[4695]: I0320 10:58:23.079508 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ac97aea2-7502-4a8e-a2da-305c88bd2f3a","Type":"ContainerStarted","Data":"f5f28fba41cb01df696c37dbe5a2ae04875dc1b337bd1b1ba1926ea62527c4c5"} Mar 20 10:58:23 crc kubenswrapper[4695]: I0320 10:58:23.079553 4695 patch_prober.go:28] interesting pod/downloads-7954f5f757-dzwsk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 20 10:58:23 crc kubenswrapper[4695]: I0320 10:58:23.079585 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-dzwsk" Mar 20 10:58:23 crc kubenswrapper[4695]: I0320 10:58:23.079621 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dzwsk" podUID="acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 20 10:58:23 crc kubenswrapper[4695]: E0320 10:58:23.084495 
4695 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 10:58:23 crc kubenswrapper[4695]: E0320 10:58:23.084642 4695 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w5h4f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-6mw7h_openshift-marketplace(d2ea4f1f-16e3-4ac7-ac16-f782b94669ff): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:58:23 crc kubenswrapper[4695]: E0320 10:58:23.086835 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-6mw7h" podUID="d2ea4f1f-16e3-4ac7-ac16-f782b94669ff" Mar 20 10:58:23 crc kubenswrapper[4695]: I0320 10:58:23.108023 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7674466785-268m7" podStartSLOduration=25.107986655 podStartE2EDuration="25.107986655s" podCreationTimestamp="2026-03-20 10:57:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:23.09067363 +0000 UTC m=+280.871279193" watchObservedRunningTime="2026-03-20 10:58:23.107986655 +0000 UTC m=+280.888592218" Mar 20 10:58:24 crc kubenswrapper[4695]: I0320 10:58:24.093691 4695 generic.go:334] "Generic (PLEG): container finished" podID="7ef41a3d-8fcb-4013-8a20-2e9e39e152aa" containerID="ead18e6d5429ea8e0279bccb3b6ddcf8d81e08603bbdb173e48f2c1684a31484" exitCode=0 Mar 20 10:58:24 crc kubenswrapper[4695]: I0320 10:58:24.094009 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7674466785-268m7" event={"ID":"7ef41a3d-8fcb-4013-8a20-2e9e39e152aa","Type":"ContainerDied","Data":"ead18e6d5429ea8e0279bccb3b6ddcf8d81e08603bbdb173e48f2c1684a31484"} Mar 20 10:58:24 crc kubenswrapper[4695]: I0320 10:58:24.097992 4695 generic.go:334] "Generic (PLEG): container finished" 
podID="ac97aea2-7502-4a8e-a2da-305c88bd2f3a" containerID="f5f28fba41cb01df696c37dbe5a2ae04875dc1b337bd1b1ba1926ea62527c4c5" exitCode=0 Mar 20 10:58:24 crc kubenswrapper[4695]: I0320 10:58:24.098080 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ac97aea2-7502-4a8e-a2da-305c88bd2f3a","Type":"ContainerDied","Data":"f5f28fba41cb01df696c37dbe5a2ae04875dc1b337bd1b1ba1926ea62527c4c5"} Mar 20 10:58:24 crc kubenswrapper[4695]: I0320 10:58:24.103009 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-54666897bb-f56tw" podUID="228abe7a-c73b-42db-8162-7a3b424eef42" containerName="controller-manager" containerID="cri-o://b47f0210f305b7ccf33acac8e11734b43f0c00a76d7232fe0878241e41b50cc1" gracePeriod=30 Mar 20 10:58:24 crc kubenswrapper[4695]: I0320 10:58:24.105702 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-54666897bb-f56tw" Mar 20 10:58:24 crc kubenswrapper[4695]: E0320 10:58:24.109625 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6mw7h" podUID="d2ea4f1f-16e3-4ac7-ac16-f782b94669ff" Mar 20 10:58:24 crc kubenswrapper[4695]: I0320 10:58:24.110031 4695 patch_prober.go:28] interesting pod/downloads-7954f5f757-dzwsk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 20 10:58:24 crc kubenswrapper[4695]: I0320 10:58:24.110114 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dzwsk" podUID="acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 20 10:58:24 crc kubenswrapper[4695]: I0320 10:58:24.110578 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-54666897bb-f56tw" Mar 20 10:58:24 crc kubenswrapper[4695]: I0320 10:58:24.159378 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-54666897bb-f56tw" podStartSLOduration=26.159348375 podStartE2EDuration="26.159348375s" podCreationTimestamp="2026-03-20 10:57:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:24.159330925 +0000 UTC m=+281.939936508" watchObservedRunningTime="2026-03-20 10:58:24.159348375 +0000 UTC m=+281.939953958" Mar 20 10:58:24 crc kubenswrapper[4695]: I0320 10:58:24.258788 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=24.258760288 podStartE2EDuration="24.258760288s" podCreationTimestamp="2026-03-20 10:58:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:24.256191943 +0000 UTC m=+282.036797506" watchObservedRunningTime="2026-03-20 10:58:24.258760288 +0000 UTC m=+282.039365851" Mar 20 10:58:24 crc kubenswrapper[4695]: E0320 10:58:24.358518 4695 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 10:58:24 crc kubenswrapper[4695]: E0320 10:58:24.358773 4695 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-stks6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-p78dm_openshift-marketplace(0b5f000a-cdbc-486a-9e77-d3bf68046cb7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:58:24 crc kubenswrapper[4695]: E0320 10:58:24.360028 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code 
= Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-p78dm" podUID="0b5f000a-cdbc-486a-9e77-d3bf68046cb7" Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.056401 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7674466785-268m7" Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.104634 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-594999bcf4-f85c6"] Mar 20 10:58:25 crc kubenswrapper[4695]: E0320 10:58:25.106236 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef41a3d-8fcb-4013-8a20-2e9e39e152aa" containerName="route-controller-manager" Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.106256 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef41a3d-8fcb-4013-8a20-2e9e39e152aa" containerName="route-controller-manager" Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.106392 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ef41a3d-8fcb-4013-8a20-2e9e39e152aa" containerName="route-controller-manager" Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.106990 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-594999bcf4-f85c6"
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.118218 4695 generic.go:334] "Generic (PLEG): container finished" podID="228abe7a-c73b-42db-8162-7a3b424eef42" containerID="b47f0210f305b7ccf33acac8e11734b43f0c00a76d7232fe0878241e41b50cc1" exitCode=0
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.118322 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54666897bb-f56tw" event={"ID":"228abe7a-c73b-42db-8162-7a3b424eef42","Type":"ContainerDied","Data":"b47f0210f305b7ccf33acac8e11734b43f0c00a76d7232fe0878241e41b50cc1"}
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.127332 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-594999bcf4-f85c6"]
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.127760 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h5s76" event={"ID":"e0468323-460e-4bf3-be74-9c2330bde834","Type":"ContainerStarted","Data":"b3d47166356213e368302015e7382912bdea9f79c988d6da052f059d008a9a22"}
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.130518 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7674466785-268m7" event={"ID":"7ef41a3d-8fcb-4013-8a20-2e9e39e152aa","Type":"ContainerDied","Data":"e21331722c94e05687f25c353c5f89dba5dcd1a3eab7a3f89f66738698d77bf2"}
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.130584 4695 scope.go:117] "RemoveContainer" containerID="ead18e6d5429ea8e0279bccb3b6ddcf8d81e08603bbdb173e48f2c1684a31484"
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.130784 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7674466785-268m7"
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.162200 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-h5s76" podStartSLOduration=239.162173602 podStartE2EDuration="3m59.162173602s" podCreationTimestamp="2026-03-20 10:54:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:25.157879933 +0000 UTC m=+282.938485516" watchObservedRunningTime="2026-03-20 10:58:25.162173602 +0000 UTC m=+282.942779165"
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.192947 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ef41a3d-8fcb-4013-8a20-2e9e39e152aa-serving-cert\") pod \"7ef41a3d-8fcb-4013-8a20-2e9e39e152aa\" (UID: \"7ef41a3d-8fcb-4013-8a20-2e9e39e152aa\") "
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.193408 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4jms\" (UniqueName: \"kubernetes.io/projected/7ef41a3d-8fcb-4013-8a20-2e9e39e152aa-kube-api-access-c4jms\") pod \"7ef41a3d-8fcb-4013-8a20-2e9e39e152aa\" (UID: \"7ef41a3d-8fcb-4013-8a20-2e9e39e152aa\") "
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.193543 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ef41a3d-8fcb-4013-8a20-2e9e39e152aa-config\") pod \"7ef41a3d-8fcb-4013-8a20-2e9e39e152aa\" (UID: \"7ef41a3d-8fcb-4013-8a20-2e9e39e152aa\") "
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.193599 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ef41a3d-8fcb-4013-8a20-2e9e39e152aa-client-ca\") pod \"7ef41a3d-8fcb-4013-8a20-2e9e39e152aa\" (UID: \"7ef41a3d-8fcb-4013-8a20-2e9e39e152aa\") "
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.194457 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ef41a3d-8fcb-4013-8a20-2e9e39e152aa-config" (OuterVolumeSpecName: "config") pod "7ef41a3d-8fcb-4013-8a20-2e9e39e152aa" (UID: "7ef41a3d-8fcb-4013-8a20-2e9e39e152aa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.194797 4695 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ef41a3d-8fcb-4013-8a20-2e9e39e152aa-config\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.195009 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ef41a3d-8fcb-4013-8a20-2e9e39e152aa-client-ca" (OuterVolumeSpecName: "client-ca") pod "7ef41a3d-8fcb-4013-8a20-2e9e39e152aa" (UID: "7ef41a3d-8fcb-4013-8a20-2e9e39e152aa"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.201026 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef41a3d-8fcb-4013-8a20-2e9e39e152aa-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7ef41a3d-8fcb-4013-8a20-2e9e39e152aa" (UID: "7ef41a3d-8fcb-4013-8a20-2e9e39e152aa"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.217322 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ef41a3d-8fcb-4013-8a20-2e9e39e152aa-kube-api-access-c4jms" (OuterVolumeSpecName: "kube-api-access-c4jms") pod "7ef41a3d-8fcb-4013-8a20-2e9e39e152aa" (UID: "7ef41a3d-8fcb-4013-8a20-2e9e39e152aa"). InnerVolumeSpecName "kube-api-access-c4jms". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.296307 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/887ffd02-2073-4750-b7ff-40d9f4c3dc93-client-ca\") pod \"route-controller-manager-594999bcf4-f85c6\" (UID: \"887ffd02-2073-4750-b7ff-40d9f4c3dc93\") " pod="openshift-route-controller-manager/route-controller-manager-594999bcf4-f85c6"
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.296440 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/887ffd02-2073-4750-b7ff-40d9f4c3dc93-serving-cert\") pod \"route-controller-manager-594999bcf4-f85c6\" (UID: \"887ffd02-2073-4750-b7ff-40d9f4c3dc93\") " pod="openshift-route-controller-manager/route-controller-manager-594999bcf4-f85c6"
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.297556 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/887ffd02-2073-4750-b7ff-40d9f4c3dc93-config\") pod \"route-controller-manager-594999bcf4-f85c6\" (UID: \"887ffd02-2073-4750-b7ff-40d9f4c3dc93\") " pod="openshift-route-controller-manager/route-controller-manager-594999bcf4-f85c6"
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.297903 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr4xk\" (UniqueName: \"kubernetes.io/projected/887ffd02-2073-4750-b7ff-40d9f4c3dc93-kube-api-access-hr4xk\") pod \"route-controller-manager-594999bcf4-f85c6\" (UID: \"887ffd02-2073-4750-b7ff-40d9f4c3dc93\") " pod="openshift-route-controller-manager/route-controller-manager-594999bcf4-f85c6"
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.298180 4695 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ef41a3d-8fcb-4013-8a20-2e9e39e152aa-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.298209 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4jms\" (UniqueName: \"kubernetes.io/projected/7ef41a3d-8fcb-4013-8a20-2e9e39e152aa-kube-api-access-c4jms\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.298226 4695 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ef41a3d-8fcb-4013-8a20-2e9e39e152aa-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.399504 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/887ffd02-2073-4750-b7ff-40d9f4c3dc93-config\") pod \"route-controller-manager-594999bcf4-f85c6\" (UID: \"887ffd02-2073-4750-b7ff-40d9f4c3dc93\") " pod="openshift-route-controller-manager/route-controller-manager-594999bcf4-f85c6"
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.399606 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr4xk\" (UniqueName: \"kubernetes.io/projected/887ffd02-2073-4750-b7ff-40d9f4c3dc93-kube-api-access-hr4xk\") pod \"route-controller-manager-594999bcf4-f85c6\" (UID: \"887ffd02-2073-4750-b7ff-40d9f4c3dc93\") " pod="openshift-route-controller-manager/route-controller-manager-594999bcf4-f85c6"
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.399681 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/887ffd02-2073-4750-b7ff-40d9f4c3dc93-client-ca\") pod \"route-controller-manager-594999bcf4-f85c6\" (UID: \"887ffd02-2073-4750-b7ff-40d9f4c3dc93\") " pod="openshift-route-controller-manager/route-controller-manager-594999bcf4-f85c6"
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.399725 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/887ffd02-2073-4750-b7ff-40d9f4c3dc93-serving-cert\") pod \"route-controller-manager-594999bcf4-f85c6\" (UID: \"887ffd02-2073-4750-b7ff-40d9f4c3dc93\") " pod="openshift-route-controller-manager/route-controller-manager-594999bcf4-f85c6"
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.400865 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/887ffd02-2073-4750-b7ff-40d9f4c3dc93-client-ca\") pod \"route-controller-manager-594999bcf4-f85c6\" (UID: \"887ffd02-2073-4750-b7ff-40d9f4c3dc93\") " pod="openshift-route-controller-manager/route-controller-manager-594999bcf4-f85c6"
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.401148 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/887ffd02-2073-4750-b7ff-40d9f4c3dc93-config\") pod \"route-controller-manager-594999bcf4-f85c6\" (UID: \"887ffd02-2073-4750-b7ff-40d9f4c3dc93\") " pod="openshift-route-controller-manager/route-controller-manager-594999bcf4-f85c6"
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.406557 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/887ffd02-2073-4750-b7ff-40d9f4c3dc93-serving-cert\") pod \"route-controller-manager-594999bcf4-f85c6\" (UID: \"887ffd02-2073-4750-b7ff-40d9f4c3dc93\") " pod="openshift-route-controller-manager/route-controller-manager-594999bcf4-f85c6"
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.421331 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr4xk\" (UniqueName: \"kubernetes.io/projected/887ffd02-2073-4750-b7ff-40d9f4c3dc93-kube-api-access-hr4xk\") pod \"route-controller-manager-594999bcf4-f85c6\" (UID: \"887ffd02-2073-4750-b7ff-40d9f4c3dc93\") " pod="openshift-route-controller-manager/route-controller-manager-594999bcf4-f85c6"
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.426298 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-594999bcf4-f85c6"
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.466015 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7674466785-268m7"]
Mar 20 10:58:25 crc kubenswrapper[4695]: I0320 10:58:25.469674 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7674466785-268m7"]
Mar 20 10:58:26 crc kubenswrapper[4695]: E0320 10:58:26.251977 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-p78dm" podUID="0b5f000a-cdbc-486a-9e77-d3bf68046cb7"
Mar 20 10:58:26 crc kubenswrapper[4695]: I0320 10:58:26.305893 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54666897bb-f56tw"
Mar 20 10:58:26 crc kubenswrapper[4695]: I0320 10:58:26.310004 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 20 10:58:26 crc kubenswrapper[4695]: I0320 10:58:26.414076 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/228abe7a-c73b-42db-8162-7a3b424eef42-client-ca\") pod \"228abe7a-c73b-42db-8162-7a3b424eef42\" (UID: \"228abe7a-c73b-42db-8162-7a3b424eef42\") "
Mar 20 10:58:26 crc kubenswrapper[4695]: I0320 10:58:26.414192 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/228abe7a-c73b-42db-8162-7a3b424eef42-config\") pod \"228abe7a-c73b-42db-8162-7a3b424eef42\" (UID: \"228abe7a-c73b-42db-8162-7a3b424eef42\") "
Mar 20 10:58:26 crc kubenswrapper[4695]: I0320 10:58:26.415035 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ac97aea2-7502-4a8e-a2da-305c88bd2f3a-kube-api-access\") pod \"ac97aea2-7502-4a8e-a2da-305c88bd2f3a\" (UID: \"ac97aea2-7502-4a8e-a2da-305c88bd2f3a\") "
Mar 20 10:58:26 crc kubenswrapper[4695]: I0320 10:58:26.415088 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ac97aea2-7502-4a8e-a2da-305c88bd2f3a-kubelet-dir\") pod \"ac97aea2-7502-4a8e-a2da-305c88bd2f3a\" (UID: \"ac97aea2-7502-4a8e-a2da-305c88bd2f3a\") "
Mar 20 10:58:26 crc kubenswrapper[4695]: I0320 10:58:26.415110 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/228abe7a-c73b-42db-8162-7a3b424eef42-proxy-ca-bundles\") pod \"228abe7a-c73b-42db-8162-7a3b424eef42\" (UID: \"228abe7a-c73b-42db-8162-7a3b424eef42\") "
Mar 20 10:58:26 crc kubenswrapper[4695]: I0320 10:58:26.415203 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/228abe7a-c73b-42db-8162-7a3b424eef42-serving-cert\") pod \"228abe7a-c73b-42db-8162-7a3b424eef42\" (UID: \"228abe7a-c73b-42db-8162-7a3b424eef42\") "
Mar 20 10:58:26 crc kubenswrapper[4695]: I0320 10:58:26.415262 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xczf4\" (UniqueName: \"kubernetes.io/projected/228abe7a-c73b-42db-8162-7a3b424eef42-kube-api-access-xczf4\") pod \"228abe7a-c73b-42db-8162-7a3b424eef42\" (UID: \"228abe7a-c73b-42db-8162-7a3b424eef42\") "
Mar 20 10:58:26 crc kubenswrapper[4695]: I0320 10:58:26.415361 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ac97aea2-7502-4a8e-a2da-305c88bd2f3a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ac97aea2-7502-4a8e-a2da-305c88bd2f3a" (UID: "ac97aea2-7502-4a8e-a2da-305c88bd2f3a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 10:58:26 crc kubenswrapper[4695]: I0320 10:58:26.415727 4695 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ac97aea2-7502-4a8e-a2da-305c88bd2f3a-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:26 crc kubenswrapper[4695]: I0320 10:58:26.415355 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/228abe7a-c73b-42db-8162-7a3b424eef42-client-ca" (OuterVolumeSpecName: "client-ca") pod "228abe7a-c73b-42db-8162-7a3b424eef42" (UID: "228abe7a-c73b-42db-8162-7a3b424eef42"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:58:26 crc kubenswrapper[4695]: I0320 10:58:26.415848 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/228abe7a-c73b-42db-8162-7a3b424eef42-config" (OuterVolumeSpecName: "config") pod "228abe7a-c73b-42db-8162-7a3b424eef42" (UID: "228abe7a-c73b-42db-8162-7a3b424eef42"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:58:26 crc kubenswrapper[4695]: I0320 10:58:26.416056 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/228abe7a-c73b-42db-8162-7a3b424eef42-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "228abe7a-c73b-42db-8162-7a3b424eef42" (UID: "228abe7a-c73b-42db-8162-7a3b424eef42"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:58:26 crc kubenswrapper[4695]: I0320 10:58:26.421201 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac97aea2-7502-4a8e-a2da-305c88bd2f3a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ac97aea2-7502-4a8e-a2da-305c88bd2f3a" (UID: "ac97aea2-7502-4a8e-a2da-305c88bd2f3a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:58:26 crc kubenswrapper[4695]: I0320 10:58:26.421704 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/228abe7a-c73b-42db-8162-7a3b424eef42-kube-api-access-xczf4" (OuterVolumeSpecName: "kube-api-access-xczf4") pod "228abe7a-c73b-42db-8162-7a3b424eef42" (UID: "228abe7a-c73b-42db-8162-7a3b424eef42"). InnerVolumeSpecName "kube-api-access-xczf4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:58:26 crc kubenswrapper[4695]: I0320 10:58:26.422258 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/228abe7a-c73b-42db-8162-7a3b424eef42-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "228abe7a-c73b-42db-8162-7a3b424eef42" (UID: "228abe7a-c73b-42db-8162-7a3b424eef42"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:58:26 crc kubenswrapper[4695]: I0320 10:58:26.517969 4695 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/228abe7a-c73b-42db-8162-7a3b424eef42-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:26 crc kubenswrapper[4695]: I0320 10:58:26.518030 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xczf4\" (UniqueName: \"kubernetes.io/projected/228abe7a-c73b-42db-8162-7a3b424eef42-kube-api-access-xczf4\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:26 crc kubenswrapper[4695]: I0320 10:58:26.518049 4695 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/228abe7a-c73b-42db-8162-7a3b424eef42-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:26 crc kubenswrapper[4695]: I0320 10:58:26.518093 4695 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/228abe7a-c73b-42db-8162-7a3b424eef42-config\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:26 crc kubenswrapper[4695]: I0320 10:58:26.518105 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ac97aea2-7502-4a8e-a2da-305c88bd2f3a-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:26 crc kubenswrapper[4695]: I0320 10:58:26.518117 4695 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/228abe7a-c73b-42db-8162-7a3b424eef42-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:26 crc kubenswrapper[4695]: I0320 10:58:26.895168 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ef41a3d-8fcb-4013-8a20-2e9e39e152aa" path="/var/lib/kubelet/pods/7ef41a3d-8fcb-4013-8a20-2e9e39e152aa/volumes"
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.119762 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk"]
Mar 20 10:58:27 crc kubenswrapper[4695]: E0320 10:58:27.121033 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="228abe7a-c73b-42db-8162-7a3b424eef42" containerName="controller-manager"
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.121059 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="228abe7a-c73b-42db-8162-7a3b424eef42" containerName="controller-manager"
Mar 20 10:58:27 crc kubenswrapper[4695]: E0320 10:58:27.121088 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac97aea2-7502-4a8e-a2da-305c88bd2f3a" containerName="pruner"
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.121097 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac97aea2-7502-4a8e-a2da-305c88bd2f3a" containerName="pruner"
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.121218 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac97aea2-7502-4a8e-a2da-305c88bd2f3a" containerName="pruner"
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.121233 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="228abe7a-c73b-42db-8162-7a3b424eef42" containerName="controller-manager"
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.125114 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk"
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.129115 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk"]
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.154373 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-54666897bb-f56tw" event={"ID":"228abe7a-c73b-42db-8162-7a3b424eef42","Type":"ContainerDied","Data":"0d360bd6624916fcdf400bb43639faef4b58267a0e05db440710de7939ae7e38"}
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.154477 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-54666897bb-f56tw"
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.157611 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"ac97aea2-7502-4a8e-a2da-305c88bd2f3a","Type":"ContainerDied","Data":"41b1900b01ad99043c9d53822cbed5833950341369db4c2f3bf3d13b3f3227d3"}
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.157672 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41b1900b01ad99043c9d53822cbed5833950341369db4c2f3bf3d13b3f3227d3"
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.157787 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.180896 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-54666897bb-f56tw"]
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.186114 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-54666897bb-f56tw"]
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.228924 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b98ec6b2-499d-49cb-828d-2bcd0cb18929-client-ca\") pod \"controller-manager-6655c6bcfd-p2ftk\" (UID: \"b98ec6b2-499d-49cb-828d-2bcd0cb18929\") " pod="openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk"
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.229030 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b98ec6b2-499d-49cb-828d-2bcd0cb18929-config\") pod \"controller-manager-6655c6bcfd-p2ftk\" (UID: \"b98ec6b2-499d-49cb-828d-2bcd0cb18929\") " pod="openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk"
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.229090 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b98ec6b2-499d-49cb-828d-2bcd0cb18929-proxy-ca-bundles\") pod \"controller-manager-6655c6bcfd-p2ftk\" (UID: \"b98ec6b2-499d-49cb-828d-2bcd0cb18929\") " pod="openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk"
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.229122 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8wx9\" (UniqueName: \"kubernetes.io/projected/b98ec6b2-499d-49cb-828d-2bcd0cb18929-kube-api-access-r8wx9\") pod \"controller-manager-6655c6bcfd-p2ftk\" (UID: \"b98ec6b2-499d-49cb-828d-2bcd0cb18929\") " pod="openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk"
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.229380 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b98ec6b2-499d-49cb-828d-2bcd0cb18929-serving-cert\") pod \"controller-manager-6655c6bcfd-p2ftk\" (UID: \"b98ec6b2-499d-49cb-828d-2bcd0cb18929\") " pod="openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk"
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.268250 4695 scope.go:117] "RemoveContainer" containerID="b47f0210f305b7ccf33acac8e11734b43f0c00a76d7232fe0878241e41b50cc1"
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.330564 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b98ec6b2-499d-49cb-828d-2bcd0cb18929-config\") pod \"controller-manager-6655c6bcfd-p2ftk\" (UID: \"b98ec6b2-499d-49cb-828d-2bcd0cb18929\") " pod="openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk"
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.330683 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b98ec6b2-499d-49cb-828d-2bcd0cb18929-proxy-ca-bundles\") pod \"controller-manager-6655c6bcfd-p2ftk\" (UID: \"b98ec6b2-499d-49cb-828d-2bcd0cb18929\") " pod="openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk"
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.330735 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8wx9\" (UniqueName: \"kubernetes.io/projected/b98ec6b2-499d-49cb-828d-2bcd0cb18929-kube-api-access-r8wx9\") pod \"controller-manager-6655c6bcfd-p2ftk\" (UID: \"b98ec6b2-499d-49cb-828d-2bcd0cb18929\") " pod="openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk"
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.330822 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b98ec6b2-499d-49cb-828d-2bcd0cb18929-serving-cert\") pod \"controller-manager-6655c6bcfd-p2ftk\" (UID: \"b98ec6b2-499d-49cb-828d-2bcd0cb18929\") " pod="openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk"
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.330854 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b98ec6b2-499d-49cb-828d-2bcd0cb18929-client-ca\") pod \"controller-manager-6655c6bcfd-p2ftk\" (UID: \"b98ec6b2-499d-49cb-828d-2bcd0cb18929\") " pod="openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk"
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.332415 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b98ec6b2-499d-49cb-828d-2bcd0cb18929-client-ca\") pod \"controller-manager-6655c6bcfd-p2ftk\" (UID: \"b98ec6b2-499d-49cb-828d-2bcd0cb18929\") " pod="openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk"
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.332811 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b98ec6b2-499d-49cb-828d-2bcd0cb18929-proxy-ca-bundles\") pod \"controller-manager-6655c6bcfd-p2ftk\" (UID: \"b98ec6b2-499d-49cb-828d-2bcd0cb18929\") " pod="openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk"
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.332926 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b98ec6b2-499d-49cb-828d-2bcd0cb18929-config\") pod \"controller-manager-6655c6bcfd-p2ftk\" (UID: \"b98ec6b2-499d-49cb-828d-2bcd0cb18929\") " pod="openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk"
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.336360 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b98ec6b2-499d-49cb-828d-2bcd0cb18929-serving-cert\") pod \"controller-manager-6655c6bcfd-p2ftk\" (UID: \"b98ec6b2-499d-49cb-828d-2bcd0cb18929\") " pod="openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk"
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.352722 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8wx9\" (UniqueName: \"kubernetes.io/projected/b98ec6b2-499d-49cb-828d-2bcd0cb18929-kube-api-access-r8wx9\") pod \"controller-manager-6655c6bcfd-p2ftk\" (UID: \"b98ec6b2-499d-49cb-828d-2bcd0cb18929\") " pod="openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk"
Mar 20 10:58:27 crc kubenswrapper[4695]: I0320 10:58:27.446242 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk"
Mar 20 10:58:28 crc kubenswrapper[4695]: I0320 10:58:28.895284 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="228abe7a-c73b-42db-8162-7a3b424eef42" path="/var/lib/kubelet/pods/228abe7a-c73b-42db-8162-7a3b424eef42/volumes"
Mar 20 10:58:29 crc kubenswrapper[4695]: I0320 10:58:29.228265 4695 patch_prober.go:28] interesting pod/downloads-7954f5f757-dzwsk container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Mar 20 10:58:29 crc kubenswrapper[4695]: I0320 10:58:29.228348 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dzwsk" podUID="acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Mar 20 10:58:29 crc kubenswrapper[4695]: I0320 10:58:29.228384 4695 patch_prober.go:28] interesting pod/downloads-7954f5f757-dzwsk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Mar 20 10:58:29 crc kubenswrapper[4695]: I0320 10:58:29.228468 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-dzwsk" podUID="acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Mar 20 10:58:30 crc kubenswrapper[4695]: I0320 10:58:30.398846 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-594999bcf4-f85c6"]
Mar 20 10:58:30 crc kubenswrapper[4695]: I0320 10:58:30.908515 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk"]
Mar 20 10:58:31 crc kubenswrapper[4695]: I0320 10:58:31.196034 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kt9vc" event={"ID":"53830966-0b62-40fe-9f81-c18c95ea50aa","Type":"ContainerStarted","Data":"65a4d61eb9a50322c9f027e48ec99783c57db114dbcb7863ad9010c2a35005a8"}
Mar 20 10:58:31 crc kubenswrapper[4695]: I0320 10:58:31.198550 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-594999bcf4-f85c6" event={"ID":"887ffd02-2073-4750-b7ff-40d9f4c3dc93","Type":"ContainerStarted","Data":"ad0d1b83613ecba880f4294d3e784311f406ebe6c6e3a672b42cc44f400d70bf"}
Mar 20 10:58:31 crc kubenswrapper[4695]: I0320 10:58:31.198602 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-594999bcf4-f85c6" event={"ID":"887ffd02-2073-4750-b7ff-40d9f4c3dc93","Type":"ContainerStarted","Data":"3cbe561d7516e0ea576dcda15d62862640bddc32a0a2e89ce02a3415c2820b94"}
Mar 20 10:58:31 crc kubenswrapper[4695]: I0320 10:58:31.200764 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-594999bcf4-f85c6"
Mar 20 10:58:31 crc kubenswrapper[4695]: I0320 10:58:31.202382 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566736-t45k7" event={"ID":"b1ec0c6c-21bf-4b5f-b973-fd68c0d1c0f5","Type":"ContainerStarted","Data":"da190adf44ad8ab38b9bfd7c46ac1ac8c52e4a985325a6af23379c5273f634d5"}
Mar 20 10:58:31 crc kubenswrapper[4695]: I0320 10:58:31.204449 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28xd8" event={"ID":"da83bf65-5995-41cf-8f79-98a77e0ace2e","Type":"ContainerStarted","Data":"489bdf505868738ef5d106284c55cc6da8b8deabd146e53f91dca8da7d9958a6"}
Mar 20 10:58:31 crc kubenswrapper[4695]: I0320 10:58:31.205767 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566738-dkbr4" event={"ID":"e3280710-4b29-43c1-8f83-4c18670a9e0a","Type":"ContainerStarted","Data":"46b2e745e6b9b201ca52ce5404c6ab2af4f5a866aed62740ab55e0a9e5f394ed"}
Mar 20 10:58:31 crc kubenswrapper[4695]: I0320 10:58:31.233416 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-594999bcf4-f85c6"
Mar 20 10:58:31 crc kubenswrapper[4695]: I0320 10:58:31.260954 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-594999bcf4-f85c6" podStartSLOduration=13.260926735 podStartE2EDuration="13.260926735s" podCreationTimestamp="2026-03-20 10:58:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:31.256617696 +0000 UTC m=+289.037223259" watchObservedRunningTime="2026-03-20 10:58:31.260926735 +0000 UTC m=+289.041532298"
Mar 20 10:58:31 crc kubenswrapper[4695]: I0320 10:58:31.290287 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566738-dkbr4" podStartSLOduration=22.13342643 podStartE2EDuration="31.290257623s" podCreationTimestamp="2026-03-20 10:58:00 +0000 UTC" firstStartedPulling="2026-03-20 10:58:20.918447992 +0000 UTC m=+278.699053555" lastFinishedPulling="2026-03-20 10:58:30.075279185 +0000 UTC m=+287.855884748" observedRunningTime="2026-03-20 10:58:31.286356025 +0000 UTC m=+289.066961588" watchObservedRunningTime="2026-03-20 10:58:31.290257623 +0000 UTC m=+289.070863186"
Mar 20 10:58:31 crc kubenswrapper[4695]: I0320 10:58:31.563451 4695 csr.go:261] certificate signing request csr-9xd7z is approved, waiting to be issued
Mar 20 10:58:31 crc kubenswrapper[4695]: I0320 10:58:31.576971 4695 csr.go:257] certificate signing request csr-9xd7z is issued
Mar 20 10:58:32 crc kubenswrapper[4695]: I0320 10:58:32.230545 4695 generic.go:334] "Generic (PLEG): container finished" podID="da83bf65-5995-41cf-8f79-98a77e0ace2e" containerID="489bdf505868738ef5d106284c55cc6da8b8deabd146e53f91dca8da7d9958a6" exitCode=0
Mar 20 10:58:32 crc kubenswrapper[4695]: I0320 10:58:32.230657 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28xd8" event={"ID":"da83bf65-5995-41cf-8f79-98a77e0ace2e","Type":"ContainerDied","Data":"489bdf505868738ef5d106284c55cc6da8b8deabd146e53f91dca8da7d9958a6"}
Mar 20 10:58:32 crc kubenswrapper[4695]: I0320 10:58:32.235697 4695 generic.go:334] "Generic (PLEG): container finished" podID="e3280710-4b29-43c1-8f83-4c18670a9e0a" containerID="46b2e745e6b9b201ca52ce5404c6ab2af4f5a866aed62740ab55e0a9e5f394ed" exitCode=0
Mar 20 10:58:32 crc kubenswrapper[4695]: I0320 10:58:32.236496 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566738-dkbr4" event={"ID":"e3280710-4b29-43c1-8f83-4c18670a9e0a","Type":"ContainerDied","Data":"46b2e745e6b9b201ca52ce5404c6ab2af4f5a866aed62740ab55e0a9e5f394ed"}
Mar 20 10:58:32 crc kubenswrapper[4695]: I0320 10:58:32.283579 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566736-t45k7" podStartSLOduration=74.225647714 podStartE2EDuration="2m32.2835453s" podCreationTimestamp="2026-03-20 10:56:00 +0000 UTC" firstStartedPulling="2026-03-20 10:57:12.346765471 +0000 UTC m=+210.127371034" lastFinishedPulling="2026-03-20 10:58:30.404663057 +0000 UTC m=+288.185268620" observedRunningTime="2026-03-20 10:58:32.275340363 +0000 UTC m=+290.055945946" watchObservedRunningTime="2026-03-20 10:58:32.2835453 +0000 UTC m=+290.064150863"
Mar 20 10:58:32 crc kubenswrapper[4695]: I0320 10:58:32.579303 4695 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-01 15:16:27.760547609 +0000 UTC
Mar 20 10:58:32 crc kubenswrapper[4695]: I0320 10:58:32.579362 4695 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6892h17m55.181189182s for next certificate rotation
Mar 20 10:58:33 crc kubenswrapper[4695]: I0320 10:58:33.243134 4695 generic.go:334] "Generic (PLEG): container finished" podID="b1ec0c6c-21bf-4b5f-b973-fd68c0d1c0f5" containerID="da190adf44ad8ab38b9bfd7c46ac1ac8c52e4a985325a6af23379c5273f634d5" exitCode=0
Mar 20 10:58:33 crc kubenswrapper[4695]: I0320 10:58:33.244794 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566736-t45k7" event={"ID":"b1ec0c6c-21bf-4b5f-b973-fd68c0d1c0f5","Type":"ContainerDied","Data":"da190adf44ad8ab38b9bfd7c46ac1ac8c52e4a985325a6af23379c5273f634d5"}
Mar 20 10:58:33 crc kubenswrapper[4695]: I0320 10:58:33.334666 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" podUID="c5f41a94-dc5d-4026-983e-52e817217252" containerName="oauth-openshift" containerID="cri-o://537c70a1f9d4c6e8600ac402a58bc5f2eab15e5bd2588dba31a8908ee46319f1" gracePeriod=15
Mar 20 10:58:33 crc kubenswrapper[4695]: I0320 10:58:33.591545 4695 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-20 22:20:09.462249915 +0000 UTC
Mar 20 10:58:33 crc kubenswrapper[4695]: I0320 10:58:33.591645 4695 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6611h21m35.870609122s for next certificate rotation
Mar 20 10:58:34 crc kubenswrapper[4695]: I0320 10:58:34.254417 4695 generic.go:334] "Generic (PLEG): container finished"
podID="53830966-0b62-40fe-9f81-c18c95ea50aa" containerID="65a4d61eb9a50322c9f027e48ec99783c57db114dbcb7863ad9010c2a35005a8" exitCode=0 Mar 20 10:58:34 crc kubenswrapper[4695]: I0320 10:58:34.254526 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kt9vc" event={"ID":"53830966-0b62-40fe-9f81-c18c95ea50aa","Type":"ContainerDied","Data":"65a4d61eb9a50322c9f027e48ec99783c57db114dbcb7863ad9010c2a35005a8"} Mar 20 10:58:35 crc kubenswrapper[4695]: I0320 10:58:35.291422 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566738-dkbr4" Mar 20 10:58:35 crc kubenswrapper[4695]: I0320 10:58:35.310678 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566736-t45k7" Mar 20 10:58:35 crc kubenswrapper[4695]: I0320 10:58:35.340256 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dvbm\" (UniqueName: \"kubernetes.io/projected/e3280710-4b29-43c1-8f83-4c18670a9e0a-kube-api-access-9dvbm\") pod \"e3280710-4b29-43c1-8f83-4c18670a9e0a\" (UID: \"e3280710-4b29-43c1-8f83-4c18670a9e0a\") " Mar 20 10:58:35 crc kubenswrapper[4695]: I0320 10:58:35.341160 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlw29\" (UniqueName: \"kubernetes.io/projected/b1ec0c6c-21bf-4b5f-b973-fd68c0d1c0f5-kube-api-access-wlw29\") pod \"b1ec0c6c-21bf-4b5f-b973-fd68c0d1c0f5\" (UID: \"b1ec0c6c-21bf-4b5f-b973-fd68c0d1c0f5\") " Mar 20 10:58:35 crc kubenswrapper[4695]: I0320 10:58:35.345851 4695 generic.go:334] "Generic (PLEG): container finished" podID="c5f41a94-dc5d-4026-983e-52e817217252" containerID="537c70a1f9d4c6e8600ac402a58bc5f2eab15e5bd2588dba31a8908ee46319f1" exitCode=0 Mar 20 10:58:35 crc kubenswrapper[4695]: I0320 10:58:35.345940 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" event={"ID":"c5f41a94-dc5d-4026-983e-52e817217252","Type":"ContainerDied","Data":"537c70a1f9d4c6e8600ac402a58bc5f2eab15e5bd2588dba31a8908ee46319f1"} Mar 20 10:58:35 crc kubenswrapper[4695]: I0320 10:58:35.347429 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566736-t45k7" event={"ID":"b1ec0c6c-21bf-4b5f-b973-fd68c0d1c0f5","Type":"ContainerDied","Data":"935a52373d357ebf12ed350f2e06ac50e4696b9ad52c70285c0f9c18ce60a225"} Mar 20 10:58:35 crc kubenswrapper[4695]: I0320 10:58:35.347484 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="935a52373d357ebf12ed350f2e06ac50e4696b9ad52c70285c0f9c18ce60a225" Mar 20 10:58:35 crc kubenswrapper[4695]: I0320 10:58:35.347525 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566736-t45k7" Mar 20 10:58:35 crc kubenswrapper[4695]: I0320 10:58:35.347938 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3280710-4b29-43c1-8f83-4c18670a9e0a-kube-api-access-9dvbm" (OuterVolumeSpecName: "kube-api-access-9dvbm") pod "e3280710-4b29-43c1-8f83-4c18670a9e0a" (UID: "e3280710-4b29-43c1-8f83-4c18670a9e0a"). InnerVolumeSpecName "kube-api-access-9dvbm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:58:35 crc kubenswrapper[4695]: I0320 10:58:35.348560 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk" event={"ID":"b98ec6b2-499d-49cb-828d-2bcd0cb18929","Type":"ContainerStarted","Data":"745b99654556310cb4e2670026b5e48c21d88e36ebfe6afc404e04d00d7e0561"} Mar 20 10:58:35 crc kubenswrapper[4695]: I0320 10:58:35.349716 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566738-dkbr4" event={"ID":"e3280710-4b29-43c1-8f83-4c18670a9e0a","Type":"ContainerDied","Data":"93586f846b4b452fe26bde53ddc06e5d78f06aafe418eda7d2f44947d95ca84d"} Mar 20 10:58:35 crc kubenswrapper[4695]: I0320 10:58:35.349743 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93586f846b4b452fe26bde53ddc06e5d78f06aafe418eda7d2f44947d95ca84d" Mar 20 10:58:35 crc kubenswrapper[4695]: I0320 10:58:35.349802 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566738-dkbr4" Mar 20 10:58:35 crc kubenswrapper[4695]: I0320 10:58:35.350902 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1ec0c6c-21bf-4b5f-b973-fd68c0d1c0f5-kube-api-access-wlw29" (OuterVolumeSpecName: "kube-api-access-wlw29") pod "b1ec0c6c-21bf-4b5f-b973-fd68c0d1c0f5" (UID: "b1ec0c6c-21bf-4b5f-b973-fd68c0d1c0f5"). InnerVolumeSpecName "kube-api-access-wlw29". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:58:35 crc kubenswrapper[4695]: I0320 10:58:35.443618 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dvbm\" (UniqueName: \"kubernetes.io/projected/e3280710-4b29-43c1-8f83-4c18670a9e0a-kube-api-access-9dvbm\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:35 crc kubenswrapper[4695]: I0320 10:58:35.443658 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlw29\" (UniqueName: \"kubernetes.io/projected/b1ec0c6c-21bf-4b5f-b973-fd68c0d1c0f5-kube-api-access-wlw29\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.033828 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.056069 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-user-idp-0-file-data\") pod \"c5f41a94-dc5d-4026-983e-52e817217252\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.056125 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdx85\" (UniqueName: \"kubernetes.io/projected/c5f41a94-dc5d-4026-983e-52e817217252-kube-api-access-cdx85\") pod \"c5f41a94-dc5d-4026-983e-52e817217252\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.056144 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-cliconfig\") pod \"c5f41a94-dc5d-4026-983e-52e817217252\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " Mar 20 10:58:36 crc 
kubenswrapper[4695]: I0320 10:58:36.056168 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-user-template-provider-selection\") pod \"c5f41a94-dc5d-4026-983e-52e817217252\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.056190 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-user-template-error\") pod \"c5f41a94-dc5d-4026-983e-52e817217252\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.056219 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-session\") pod \"c5f41a94-dc5d-4026-983e-52e817217252\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.056238 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c5f41a94-dc5d-4026-983e-52e817217252-audit-dir\") pod \"c5f41a94-dc5d-4026-983e-52e817217252\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.056261 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-trusted-ca-bundle\") pod \"c5f41a94-dc5d-4026-983e-52e817217252\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.056310 4695 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-service-ca\") pod \"c5f41a94-dc5d-4026-983e-52e817217252\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.056347 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-ocp-branding-template\") pod \"c5f41a94-dc5d-4026-983e-52e817217252\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.056378 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-serving-cert\") pod \"c5f41a94-dc5d-4026-983e-52e817217252\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.056405 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-router-certs\") pod \"c5f41a94-dc5d-4026-983e-52e817217252\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.056433 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c5f41a94-dc5d-4026-983e-52e817217252-audit-policies\") pod \"c5f41a94-dc5d-4026-983e-52e817217252\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.056458 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" 
(UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-user-template-login\") pod \"c5f41a94-dc5d-4026-983e-52e817217252\" (UID: \"c5f41a94-dc5d-4026-983e-52e817217252\") " Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.060480 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c5f41a94-dc5d-4026-983e-52e817217252-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "c5f41a94-dc5d-4026-983e-52e817217252" (UID: "c5f41a94-dc5d-4026-983e-52e817217252"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.062100 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "c5f41a94-dc5d-4026-983e-52e817217252" (UID: "c5f41a94-dc5d-4026-983e-52e817217252"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.062826 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "c5f41a94-dc5d-4026-983e-52e817217252" (UID: "c5f41a94-dc5d-4026-983e-52e817217252"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.062993 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5f41a94-dc5d-4026-983e-52e817217252-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "c5f41a94-dc5d-4026-983e-52e817217252" (UID: "c5f41a94-dc5d-4026-983e-52e817217252"). 
InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.063169 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "c5f41a94-dc5d-4026-983e-52e817217252" (UID: "c5f41a94-dc5d-4026-983e-52e817217252"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.066679 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5f41a94-dc5d-4026-983e-52e817217252-kube-api-access-cdx85" (OuterVolumeSpecName: "kube-api-access-cdx85") pod "c5f41a94-dc5d-4026-983e-52e817217252" (UID: "c5f41a94-dc5d-4026-983e-52e817217252"). InnerVolumeSpecName "kube-api-access-cdx85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.066698 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "c5f41a94-dc5d-4026-983e-52e817217252" (UID: "c5f41a94-dc5d-4026-983e-52e817217252"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.067669 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "c5f41a94-dc5d-4026-983e-52e817217252" (UID: "c5f41a94-dc5d-4026-983e-52e817217252"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.067834 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "c5f41a94-dc5d-4026-983e-52e817217252" (UID: "c5f41a94-dc5d-4026-983e-52e817217252"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.067990 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "c5f41a94-dc5d-4026-983e-52e817217252" (UID: "c5f41a94-dc5d-4026-983e-52e817217252"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.068245 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "c5f41a94-dc5d-4026-983e-52e817217252" (UID: "c5f41a94-dc5d-4026-983e-52e817217252"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.068395 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "c5f41a94-dc5d-4026-983e-52e817217252" (UID: "c5f41a94-dc5d-4026-983e-52e817217252"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.068862 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "c5f41a94-dc5d-4026-983e-52e817217252" (UID: "c5f41a94-dc5d-4026-983e-52e817217252"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.070116 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "c5f41a94-dc5d-4026-983e-52e817217252" (UID: "c5f41a94-dc5d-4026-983e-52e817217252"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.158769 4695 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.158828 4695 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.158846 4695 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/c5f41a94-dc5d-4026-983e-52e817217252-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.158860 4695 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.158877 4695 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.158889 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdx85\" (UniqueName: \"kubernetes.io/projected/c5f41a94-dc5d-4026-983e-52e817217252-kube-api-access-cdx85\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.158901 4695 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.158934 4695 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.158948 4695 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.159094 4695 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:36 crc 
kubenswrapper[4695]: I0320 10:58:36.159113 4695 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c5f41a94-dc5d-4026-983e-52e817217252-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.159126 4695 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.159142 4695 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.159156 4695 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/c5f41a94-dc5d-4026-983e-52e817217252-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.363068 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" event={"ID":"c5f41a94-dc5d-4026-983e-52e817217252","Type":"ContainerDied","Data":"920200843c0ace85218d42c99b2d107b41cd8f9b04aebc1e2dde089e6cbc2888"} Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.363140 4695 scope.go:117] "RemoveContainer" containerID="537c70a1f9d4c6e8600ac402a58bc5f2eab15e5bd2588dba31a8908ee46319f1" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.363313 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r82b4" Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.404158 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r82b4"] Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.407354 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r82b4"] Mar 20 10:58:36 crc kubenswrapper[4695]: I0320 10:58:36.983841 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5f41a94-dc5d-4026-983e-52e817217252" path="/var/lib/kubelet/pods/c5f41a94-dc5d-4026-983e-52e817217252/volumes" Mar 20 10:58:37 crc kubenswrapper[4695]: I0320 10:58:37.371020 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk" event={"ID":"b98ec6b2-499d-49cb-828d-2bcd0cb18929","Type":"ContainerStarted","Data":"a6d286f0523ebfc0b438c6214f1b2d22963dc55294556a9ceb2393f5abf5571b"} Mar 20 10:58:37 crc kubenswrapper[4695]: I0320 10:58:37.372381 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk" Mar 20 10:58:37 crc kubenswrapper[4695]: I0320 10:58:37.374887 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6fqn2" event={"ID":"33f393cc-11cf-4c7a-aeac-8423998e5dc6","Type":"ContainerStarted","Data":"3aabcce0ae04970822f581437e36f97e5858515e319abee4128f5097a8fcc374"} Mar 20 10:58:37 crc kubenswrapper[4695]: I0320 10:58:37.429668 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk" podStartSLOduration=19.429644678 podStartE2EDuration="19.429644678s" podCreationTimestamp="2026-03-20 10:58:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2026-03-20 10:58:37.406838624 +0000 UTC m=+295.187444187" watchObservedRunningTime="2026-03-20 10:58:37.429644678 +0000 UTC m=+295.210250231" Mar 20 10:58:37 crc kubenswrapper[4695]: I0320 10:58:37.510484 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk" Mar 20 10:58:38 crc kubenswrapper[4695]: I0320 10:58:38.764261 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk"] Mar 20 10:58:38 crc kubenswrapper[4695]: I0320 10:58:38.875896 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-594999bcf4-f85c6"] Mar 20 10:58:38 crc kubenswrapper[4695]: I0320 10:58:38.876583 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-594999bcf4-f85c6" podUID="887ffd02-2073-4750-b7ff-40d9f4c3dc93" containerName="route-controller-manager" containerID="cri-o://ad0d1b83613ecba880f4294d3e784311f406ebe6c6e3a672b42cc44f400d70bf" gracePeriod=30 Mar 20 10:58:39 crc kubenswrapper[4695]: I0320 10:58:39.229559 4695 patch_prober.go:28] interesting pod/downloads-7954f5f757-dzwsk container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 20 10:58:39 crc kubenswrapper[4695]: I0320 10:58:39.229619 4695 patch_prober.go:28] interesting pod/downloads-7954f5f757-dzwsk container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Mar 20 10:58:39 crc kubenswrapper[4695]: I0320 10:58:39.229643 4695 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-dzwsk" podUID="acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 20 10:58:39 crc kubenswrapper[4695]: I0320 10:58:39.229735 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-dzwsk" podUID="acc0b6e9-5cb1-4bf5-b66d-abb48a7a1564" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Mar 20 10:58:39 crc kubenswrapper[4695]: I0320 10:58:39.526204 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28xd8" event={"ID":"da83bf65-5995-41cf-8f79-98a77e0ace2e","Type":"ContainerStarted","Data":"e3e9fa14ffce9bf1b0980fe833d05fa119c8c722b051c1703109c5717ad5f9fe"} Mar 20 10:58:39 crc kubenswrapper[4695]: I0320 10:58:39.529681 4695 generic.go:334] "Generic (PLEG): container finished" podID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" containerID="3aabcce0ae04970822f581437e36f97e5858515e319abee4128f5097a8fcc374" exitCode=0 Mar 20 10:58:39 crc kubenswrapper[4695]: I0320 10:58:39.529878 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6fqn2" event={"ID":"33f393cc-11cf-4c7a-aeac-8423998e5dc6","Type":"ContainerDied","Data":"3aabcce0ae04970822f581437e36f97e5858515e319abee4128f5097a8fcc374"} Mar 20 10:58:40 crc kubenswrapper[4695]: I0320 10:58:40.541011 4695 generic.go:334] "Generic (PLEG): container finished" podID="887ffd02-2073-4750-b7ff-40d9f4c3dc93" containerID="ad0d1b83613ecba880f4294d3e784311f406ebe6c6e3a672b42cc44f400d70bf" exitCode=0 Mar 20 10:58:40 crc kubenswrapper[4695]: I0320 10:58:40.541115 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-594999bcf4-f85c6" 
event={"ID":"887ffd02-2073-4750-b7ff-40d9f4c3dc93","Type":"ContainerDied","Data":"ad0d1b83613ecba880f4294d3e784311f406ebe6c6e3a672b42cc44f400d70bf"}
Mar 20 10:58:40 crc kubenswrapper[4695]: I0320 10:58:40.541813 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk" podUID="b98ec6b2-499d-49cb-828d-2bcd0cb18929" containerName="controller-manager" containerID="cri-o://a6d286f0523ebfc0b438c6214f1b2d22963dc55294556a9ceb2393f5abf5571b" gracePeriod=30
Mar 20 10:58:40 crc kubenswrapper[4695]: I0320 10:58:40.571695 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-28xd8" podStartSLOduration=5.809827973 podStartE2EDuration="1m22.571677963s" podCreationTimestamp="2026-03-20 10:57:18 +0000 UTC" firstStartedPulling="2026-03-20 10:57:21.851590482 +0000 UTC m=+219.632196045" lastFinishedPulling="2026-03-20 10:58:38.613440472 +0000 UTC m=+296.394046035" observedRunningTime="2026-03-20 10:58:40.568449851 +0000 UTC m=+298.349055404" watchObservedRunningTime="2026-03-20 10:58:40.571677963 +0000 UTC m=+298.352283526"
Mar 20 10:58:40 crc kubenswrapper[4695]: I0320 10:58:40.989795 4695 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-594999bcf4-f85c6"
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.039138 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d59c4667b-crj8h"]
Mar 20 10:58:41 crc kubenswrapper[4695]: E0320 10:58:41.039738 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5f41a94-dc5d-4026-983e-52e817217252" containerName="oauth-openshift"
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.039770 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5f41a94-dc5d-4026-983e-52e817217252" containerName="oauth-openshift"
Mar 20 10:58:41 crc kubenswrapper[4695]: E0320 10:58:41.039794 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1ec0c6c-21bf-4b5f-b973-fd68c0d1c0f5" containerName="oc"
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.039814 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1ec0c6c-21bf-4b5f-b973-fd68c0d1c0f5" containerName="oc"
Mar 20 10:58:41 crc kubenswrapper[4695]: E0320 10:58:41.039851 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="887ffd02-2073-4750-b7ff-40d9f4c3dc93" containerName="route-controller-manager"
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.039862 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="887ffd02-2073-4750-b7ff-40d9f4c3dc93" containerName="route-controller-manager"
Mar 20 10:58:41 crc kubenswrapper[4695]: E0320 10:58:41.039878 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3280710-4b29-43c1-8f83-4c18670a9e0a" containerName="oc"
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.039888 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3280710-4b29-43c1-8f83-4c18670a9e0a" containerName="oc"
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.040085 4695 memory_manager.go:354] "RemoveStaleState removing state"
podUID="b1ec0c6c-21bf-4b5f-b973-fd68c0d1c0f5" containerName="oc"
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.040110 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5f41a94-dc5d-4026-983e-52e817217252" containerName="oauth-openshift"
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.040123 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="887ffd02-2073-4750-b7ff-40d9f4c3dc93" containerName="route-controller-manager"
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.040138 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3280710-4b29-43c1-8f83-4c18670a9e0a" containerName="oc"
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.040743 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d59c4667b-crj8h"
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.048556 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d59c4667b-crj8h"]
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.133107 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr4xk\" (UniqueName: \"kubernetes.io/projected/887ffd02-2073-4750-b7ff-40d9f4c3dc93-kube-api-access-hr4xk\") pod \"887ffd02-2073-4750-b7ff-40d9f4c3dc93\" (UID: \"887ffd02-2073-4750-b7ff-40d9f4c3dc93\") "
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.133171 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/887ffd02-2073-4750-b7ff-40d9f4c3dc93-config\") pod \"887ffd02-2073-4750-b7ff-40d9f4c3dc93\" (UID: \"887ffd02-2073-4750-b7ff-40d9f4c3dc93\") "
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.133204 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName:
\"kubernetes.io/secret/887ffd02-2073-4750-b7ff-40d9f4c3dc93-serving-cert\") pod \"887ffd02-2073-4750-b7ff-40d9f4c3dc93\" (UID: \"887ffd02-2073-4750-b7ff-40d9f4c3dc93\") "
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.133283 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/887ffd02-2073-4750-b7ff-40d9f4c3dc93-client-ca\") pod \"887ffd02-2073-4750-b7ff-40d9f4c3dc93\" (UID: \"887ffd02-2073-4750-b7ff-40d9f4c3dc93\") "
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.133362 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eabf108b-6be6-45b9-a150-4623727fca50-client-ca\") pod \"route-controller-manager-6d59c4667b-crj8h\" (UID: \"eabf108b-6be6-45b9-a150-4623727fca50\") " pod="openshift-route-controller-manager/route-controller-manager-6d59c4667b-crj8h"
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.133425 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eabf108b-6be6-45b9-a150-4623727fca50-config\") pod \"route-controller-manager-6d59c4667b-crj8h\" (UID: \"eabf108b-6be6-45b9-a150-4623727fca50\") " pod="openshift-route-controller-manager/route-controller-manager-6d59c4667b-crj8h"
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.133640 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwdr5\" (UniqueName: \"kubernetes.io/projected/eabf108b-6be6-45b9-a150-4623727fca50-kube-api-access-pwdr5\") pod \"route-controller-manager-6d59c4667b-crj8h\" (UID: \"eabf108b-6be6-45b9-a150-4623727fca50\") " pod="openshift-route-controller-manager/route-controller-manager-6d59c4667b-crj8h"
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.133700 4695 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eabf108b-6be6-45b9-a150-4623727fca50-serving-cert\") pod \"route-controller-manager-6d59c4667b-crj8h\" (UID: \"eabf108b-6be6-45b9-a150-4623727fca50\") " pod="openshift-route-controller-manager/route-controller-manager-6d59c4667b-crj8h"
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.134960 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/887ffd02-2073-4750-b7ff-40d9f4c3dc93-config" (OuterVolumeSpecName: "config") pod "887ffd02-2073-4750-b7ff-40d9f4c3dc93" (UID: "887ffd02-2073-4750-b7ff-40d9f4c3dc93"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.135014 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/887ffd02-2073-4750-b7ff-40d9f4c3dc93-client-ca" (OuterVolumeSpecName: "client-ca") pod "887ffd02-2073-4750-b7ff-40d9f4c3dc93" (UID: "887ffd02-2073-4750-b7ff-40d9f4c3dc93"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.140800 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/887ffd02-2073-4750-b7ff-40d9f4c3dc93-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "887ffd02-2073-4750-b7ff-40d9f4c3dc93" (UID: "887ffd02-2073-4750-b7ff-40d9f4c3dc93"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.141491 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/887ffd02-2073-4750-b7ff-40d9f4c3dc93-kube-api-access-hr4xk" (OuterVolumeSpecName: "kube-api-access-hr4xk") pod "887ffd02-2073-4750-b7ff-40d9f4c3dc93" (UID: "887ffd02-2073-4750-b7ff-40d9f4c3dc93").
InnerVolumeSpecName "kube-api-access-hr4xk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.235012 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eabf108b-6be6-45b9-a150-4623727fca50-client-ca\") pod \"route-controller-manager-6d59c4667b-crj8h\" (UID: \"eabf108b-6be6-45b9-a150-4623727fca50\") " pod="openshift-route-controller-manager/route-controller-manager-6d59c4667b-crj8h"
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.235130 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eabf108b-6be6-45b9-a150-4623727fca50-config\") pod \"route-controller-manager-6d59c4667b-crj8h\" (UID: \"eabf108b-6be6-45b9-a150-4623727fca50\") " pod="openshift-route-controller-manager/route-controller-manager-6d59c4667b-crj8h"
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.235156 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwdr5\" (UniqueName: \"kubernetes.io/projected/eabf108b-6be6-45b9-a150-4623727fca50-kube-api-access-pwdr5\") pod \"route-controller-manager-6d59c4667b-crj8h\" (UID: \"eabf108b-6be6-45b9-a150-4623727fca50\") " pod="openshift-route-controller-manager/route-controller-manager-6d59c4667b-crj8h"
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.235232 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eabf108b-6be6-45b9-a150-4623727fca50-serving-cert\") pod \"route-controller-manager-6d59c4667b-crj8h\" (UID: \"eabf108b-6be6-45b9-a150-4623727fca50\") " pod="openshift-route-controller-manager/route-controller-manager-6d59c4667b-crj8h"
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.235314 4695 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName:
\"kubernetes.io/secret/887ffd02-2073-4750-b7ff-40d9f4c3dc93-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.235331 4695 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/887ffd02-2073-4750-b7ff-40d9f4c3dc93-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.235346 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr4xk\" (UniqueName: \"kubernetes.io/projected/887ffd02-2073-4750-b7ff-40d9f4c3dc93-kube-api-access-hr4xk\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.235359 4695 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/887ffd02-2073-4750-b7ff-40d9f4c3dc93-config\") on node \"crc\" DevicePath \"\""
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.236446 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eabf108b-6be6-45b9-a150-4623727fca50-client-ca\") pod \"route-controller-manager-6d59c4667b-crj8h\" (UID: \"eabf108b-6be6-45b9-a150-4623727fca50\") " pod="openshift-route-controller-manager/route-controller-manager-6d59c4667b-crj8h"
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.236884 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eabf108b-6be6-45b9-a150-4623727fca50-config\") pod \"route-controller-manager-6d59c4667b-crj8h\" (UID: \"eabf108b-6be6-45b9-a150-4623727fca50\") " pod="openshift-route-controller-manager/route-controller-manager-6d59c4667b-crj8h"
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.239507 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eabf108b-6be6-45b9-a150-4623727fca50-serving-cert\") pod
\"route-controller-manager-6d59c4667b-crj8h\" (UID: \"eabf108b-6be6-45b9-a150-4623727fca50\") " pod="openshift-route-controller-manager/route-controller-manager-6d59c4667b-crj8h"
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.254328 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwdr5\" (UniqueName: \"kubernetes.io/projected/eabf108b-6be6-45b9-a150-4623727fca50-kube-api-access-pwdr5\") pod \"route-controller-manager-6d59c4667b-crj8h\" (UID: \"eabf108b-6be6-45b9-a150-4623727fca50\") " pod="openshift-route-controller-manager/route-controller-manager-6d59c4667b-crj8h"
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.360842 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d59c4667b-crj8h"
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.549145 4695 generic.go:334] "Generic (PLEG): container finished" podID="b98ec6b2-499d-49cb-828d-2bcd0cb18929" containerID="a6d286f0523ebfc0b438c6214f1b2d22963dc55294556a9ceb2393f5abf5571b" exitCode=0
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.549214 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk" event={"ID":"b98ec6b2-499d-49cb-828d-2bcd0cb18929","Type":"ContainerDied","Data":"a6d286f0523ebfc0b438c6214f1b2d22963dc55294556a9ceb2393f5abf5571b"}
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.550624 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-594999bcf4-f85c6" event={"ID":"887ffd02-2073-4750-b7ff-40d9f4c3dc93","Type":"ContainerDied","Data":"3cbe561d7516e0ea576dcda15d62862640bddc32a0a2e89ce02a3415c2820b94"}
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.550660 4695 scope.go:117] "RemoveContainer" containerID="ad0d1b83613ecba880f4294d3e784311f406ebe6c6e3a672b42cc44f400d70bf"
Mar 20 10:58:41 crc
kubenswrapper[4695]: I0320 10:58:41.550678 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-594999bcf4-f85c6"
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.587024 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-594999bcf4-f85c6"]
Mar 20 10:58:41 crc kubenswrapper[4695]: I0320 10:58:41.590264 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-594999bcf4-f85c6"]
Mar 20 10:58:42 crc kubenswrapper[4695]: I0320 10:58:42.895290 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="887ffd02-2073-4750-b7ff-40d9f4c3dc93" path="/var/lib/kubelet/pods/887ffd02-2073-4750-b7ff-40d9f4c3dc93/volumes"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.503151 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"]
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.505160 4695 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.510055 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.510309 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.511160 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.511495 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.511552 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.511656 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.511871 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.515535 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.517643 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.517687 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 20 10:58:43
crc kubenswrapper[4695]: I0320 10:58:43.517957 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.518594 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.525032 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.528142 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.535345 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"]
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.537264 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.571841 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-user-template-login\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.571881 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID:
\"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.571941 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-system-router-certs\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.572111 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.572169 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-audit-policies\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.572194 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-audit-dir\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.572225 4695 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-system-service-ca\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.572265 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.572303 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.572371 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbzn6\" (UniqueName: \"kubernetes.io/projected/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-kube-api-access-vbzn6\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.572486 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName:
\"kubernetes.io/secret/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-system-session\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.572555 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-user-template-error\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.572588 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.572614 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.678802 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-system-session\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID:
\"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.678935 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-user-template-error\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.678991 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.679022 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.679056 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-user-template-login\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.679081 4695 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.679125 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-system-router-certs\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.679186 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.679212 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-audit-policies\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.679238 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-audit-dir\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID:
\"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.679267 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-system-service-ca\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.679295 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.679333 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.679373 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbzn6\" (UniqueName: \"kubernetes.io/projected/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-kube-api-access-vbzn6\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.680795 4695 operation_generator.go:637] "MountVolume.SetUp succeeded
for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx" Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.681102 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-system-service-ca\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx" Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.681269 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-audit-dir\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx" Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.681510 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-audit-policies\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx" Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.682067 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx" Mar 20 
10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.692229 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx" Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.695933 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx" Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.698323 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-user-template-login\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx" Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.698647 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-system-session\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx" Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.699773 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx" Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.704237 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-system-router-certs\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx" Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.716309 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-user-template-error\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx" Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.723137 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx" Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.723635 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbzn6\" (UniqueName: \"kubernetes.io/projected/a798f826-a3bf-4af3-abbb-eef3b1a77fd7-kube-api-access-vbzn6\") pod \"oauth-openshift-5fd878c9cf-ff5kx\" (UID: \"a798f826-a3bf-4af3-abbb-eef3b1a77fd7\") " pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx" 
Mar 20 10:58:43 crc kubenswrapper[4695]: I0320 10:58:43.831116 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.372878 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.412083 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6fd986d555-bc6rq"] Mar 20 10:58:45 crc kubenswrapper[4695]: E0320 10:58:45.412341 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b98ec6b2-499d-49cb-828d-2bcd0cb18929" containerName="controller-manager" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.412353 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="b98ec6b2-499d-49cb-828d-2bcd0cb18929" containerName="controller-manager" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.412485 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="b98ec6b2-499d-49cb-828d-2bcd0cb18929" containerName="controller-manager" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.412877 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6fd986d555-bc6rq" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.476546 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6fd986d555-bc6rq"] Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.504613 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b98ec6b2-499d-49cb-828d-2bcd0cb18929-config\") pod \"b98ec6b2-499d-49cb-828d-2bcd0cb18929\" (UID: \"b98ec6b2-499d-49cb-828d-2bcd0cb18929\") " Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.504698 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b98ec6b2-499d-49cb-828d-2bcd0cb18929-proxy-ca-bundles\") pod \"b98ec6b2-499d-49cb-828d-2bcd0cb18929\" (UID: \"b98ec6b2-499d-49cb-828d-2bcd0cb18929\") " Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.504784 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8wx9\" (UniqueName: \"kubernetes.io/projected/b98ec6b2-499d-49cb-828d-2bcd0cb18929-kube-api-access-r8wx9\") pod \"b98ec6b2-499d-49cb-828d-2bcd0cb18929\" (UID: \"b98ec6b2-499d-49cb-828d-2bcd0cb18929\") " Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.504865 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b98ec6b2-499d-49cb-828d-2bcd0cb18929-serving-cert\") pod \"b98ec6b2-499d-49cb-828d-2bcd0cb18929\" (UID: \"b98ec6b2-499d-49cb-828d-2bcd0cb18929\") " Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.504976 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b98ec6b2-499d-49cb-828d-2bcd0cb18929-client-ca\") pod \"b98ec6b2-499d-49cb-828d-2bcd0cb18929\" (UID: 
\"b98ec6b2-499d-49cb-828d-2bcd0cb18929\") " Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.505186 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c34e360-2b0a-43ea-b447-1cbec6c88848-config\") pod \"controller-manager-6fd986d555-bc6rq\" (UID: \"7c34e360-2b0a-43ea-b447-1cbec6c88848\") " pod="openshift-controller-manager/controller-manager-6fd986d555-bc6rq" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.505234 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c34e360-2b0a-43ea-b447-1cbec6c88848-proxy-ca-bundles\") pod \"controller-manager-6fd986d555-bc6rq\" (UID: \"7c34e360-2b0a-43ea-b447-1cbec6c88848\") " pod="openshift-controller-manager/controller-manager-6fd986d555-bc6rq" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.505272 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7c34e360-2b0a-43ea-b447-1cbec6c88848-client-ca\") pod \"controller-manager-6fd986d555-bc6rq\" (UID: \"7c34e360-2b0a-43ea-b447-1cbec6c88848\") " pod="openshift-controller-manager/controller-manager-6fd986d555-bc6rq" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.505348 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c34e360-2b0a-43ea-b447-1cbec6c88848-serving-cert\") pod \"controller-manager-6fd986d555-bc6rq\" (UID: \"7c34e360-2b0a-43ea-b447-1cbec6c88848\") " pod="openshift-controller-manager/controller-manager-6fd986d555-bc6rq" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.505383 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvbpg\" (UniqueName: 
\"kubernetes.io/projected/7c34e360-2b0a-43ea-b447-1cbec6c88848-kube-api-access-tvbpg\") pod \"controller-manager-6fd986d555-bc6rq\" (UID: \"7c34e360-2b0a-43ea-b447-1cbec6c88848\") " pod="openshift-controller-manager/controller-manager-6fd986d555-bc6rq" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.506539 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b98ec6b2-499d-49cb-828d-2bcd0cb18929-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b98ec6b2-499d-49cb-828d-2bcd0cb18929" (UID: "b98ec6b2-499d-49cb-828d-2bcd0cb18929"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.507205 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b98ec6b2-499d-49cb-828d-2bcd0cb18929-config" (OuterVolumeSpecName: "config") pod "b98ec6b2-499d-49cb-828d-2bcd0cb18929" (UID: "b98ec6b2-499d-49cb-828d-2bcd0cb18929"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.507739 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b98ec6b2-499d-49cb-828d-2bcd0cb18929-client-ca" (OuterVolumeSpecName: "client-ca") pod "b98ec6b2-499d-49cb-828d-2bcd0cb18929" (UID: "b98ec6b2-499d-49cb-828d-2bcd0cb18929"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.509097 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b98ec6b2-499d-49cb-828d-2bcd0cb18929-kube-api-access-r8wx9" (OuterVolumeSpecName: "kube-api-access-r8wx9") pod "b98ec6b2-499d-49cb-828d-2bcd0cb18929" (UID: "b98ec6b2-499d-49cb-828d-2bcd0cb18929"). InnerVolumeSpecName "kube-api-access-r8wx9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.513143 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b98ec6b2-499d-49cb-828d-2bcd0cb18929-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b98ec6b2-499d-49cb-828d-2bcd0cb18929" (UID: "b98ec6b2-499d-49cb-828d-2bcd0cb18929"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.576501 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk" event={"ID":"b98ec6b2-499d-49cb-828d-2bcd0cb18929","Type":"ContainerDied","Data":"745b99654556310cb4e2670026b5e48c21d88e36ebfe6afc404e04d00d7e0561"} Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.576583 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.606629 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk"] Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.607610 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7c34e360-2b0a-43ea-b447-1cbec6c88848-client-ca\") pod \"controller-manager-6fd986d555-bc6rq\" (UID: \"7c34e360-2b0a-43ea-b447-1cbec6c88848\") " pod="openshift-controller-manager/controller-manager-6fd986d555-bc6rq" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.607795 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c34e360-2b0a-43ea-b447-1cbec6c88848-serving-cert\") pod \"controller-manager-6fd986d555-bc6rq\" (UID: \"7c34e360-2b0a-43ea-b447-1cbec6c88848\") " 
pod="openshift-controller-manager/controller-manager-6fd986d555-bc6rq" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.607871 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvbpg\" (UniqueName: \"kubernetes.io/projected/7c34e360-2b0a-43ea-b447-1cbec6c88848-kube-api-access-tvbpg\") pod \"controller-manager-6fd986d555-bc6rq\" (UID: \"7c34e360-2b0a-43ea-b447-1cbec6c88848\") " pod="openshift-controller-manager/controller-manager-6fd986d555-bc6rq" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.607950 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c34e360-2b0a-43ea-b447-1cbec6c88848-config\") pod \"controller-manager-6fd986d555-bc6rq\" (UID: \"7c34e360-2b0a-43ea-b447-1cbec6c88848\") " pod="openshift-controller-manager/controller-manager-6fd986d555-bc6rq" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.608017 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c34e360-2b0a-43ea-b447-1cbec6c88848-proxy-ca-bundles\") pod \"controller-manager-6fd986d555-bc6rq\" (UID: \"7c34e360-2b0a-43ea-b447-1cbec6c88848\") " pod="openshift-controller-manager/controller-manager-6fd986d555-bc6rq" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.608111 4695 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b98ec6b2-499d-49cb-828d-2bcd0cb18929-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.608132 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8wx9\" (UniqueName: \"kubernetes.io/projected/b98ec6b2-499d-49cb-828d-2bcd0cb18929-kube-api-access-r8wx9\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.608170 4695 reconciler_common.go:293] "Volume 
detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b98ec6b2-499d-49cb-828d-2bcd0cb18929-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.608187 4695 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b98ec6b2-499d-49cb-828d-2bcd0cb18929-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.608199 4695 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b98ec6b2-499d-49cb-828d-2bcd0cb18929-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.608697 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7c34e360-2b0a-43ea-b447-1cbec6c88848-client-ca\") pod \"controller-manager-6fd986d555-bc6rq\" (UID: \"7c34e360-2b0a-43ea-b447-1cbec6c88848\") " pod="openshift-controller-manager/controller-manager-6fd986d555-bc6rq" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.609766 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c34e360-2b0a-43ea-b447-1cbec6c88848-proxy-ca-bundles\") pod \"controller-manager-6fd986d555-bc6rq\" (UID: \"7c34e360-2b0a-43ea-b447-1cbec6c88848\") " pod="openshift-controller-manager/controller-manager-6fd986d555-bc6rq" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.609790 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c34e360-2b0a-43ea-b447-1cbec6c88848-config\") pod \"controller-manager-6fd986d555-bc6rq\" (UID: \"7c34e360-2b0a-43ea-b447-1cbec6c88848\") " pod="openshift-controller-manager/controller-manager-6fd986d555-bc6rq" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.614933 4695 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c34e360-2b0a-43ea-b447-1cbec6c88848-serving-cert\") pod \"controller-manager-6fd986d555-bc6rq\" (UID: \"7c34e360-2b0a-43ea-b447-1cbec6c88848\") " pod="openshift-controller-manager/controller-manager-6fd986d555-bc6rq" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.615777 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6655c6bcfd-p2ftk"] Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.628331 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvbpg\" (UniqueName: \"kubernetes.io/projected/7c34e360-2b0a-43ea-b447-1cbec6c88848-kube-api-access-tvbpg\") pod \"controller-manager-6fd986d555-bc6rq\" (UID: \"7c34e360-2b0a-43ea-b447-1cbec6c88848\") " pod="openshift-controller-manager/controller-manager-6fd986d555-bc6rq" Mar 20 10:58:45 crc kubenswrapper[4695]: I0320 10:58:45.736408 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6fd986d555-bc6rq" Mar 20 10:58:46 crc kubenswrapper[4695]: I0320 10:58:46.895425 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b98ec6b2-499d-49cb-828d-2bcd0cb18929" path="/var/lib/kubelet/pods/b98ec6b2-499d-49cb-828d-2bcd0cb18929/volumes" Mar 20 10:58:46 crc kubenswrapper[4695]: I0320 10:58:46.967964 4695 scope.go:117] "RemoveContainer" containerID="a6d286f0523ebfc0b438c6214f1b2d22963dc55294556a9ceb2393f5abf5571b" Mar 20 10:58:49 crc kubenswrapper[4695]: I0320 10:58:49.235159 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-dzwsk" Mar 20 10:58:49 crc kubenswrapper[4695]: I0320 10:58:49.284124 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-28xd8" Mar 20 10:58:49 crc kubenswrapper[4695]: I0320 10:58:49.284177 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-28xd8" Mar 20 10:58:50 crc kubenswrapper[4695]: I0320 10:58:50.090235 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-28xd8" Mar 20 10:58:50 crc kubenswrapper[4695]: I0320 10:58:50.137106 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-28xd8" Mar 20 10:58:53 crc kubenswrapper[4695]: I0320 10:58:53.161738 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d59c4667b-crj8h"] Mar 20 10:58:53 crc kubenswrapper[4695]: I0320 10:58:53.392876 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx"] Mar 20 10:58:53 crc kubenswrapper[4695]: I0320 10:58:53.615044 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-6fd986d555-bc6rq"] Mar 20 10:58:54 crc kubenswrapper[4695]: I0320 10:58:54.076018 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx" event={"ID":"a798f826-a3bf-4af3-abbb-eef3b1a77fd7","Type":"ContainerStarted","Data":"25dbb421435555da702b375905a8bdf32570084cbf3a00da85c2a13207790e7f"} Mar 20 10:58:54 crc kubenswrapper[4695]: I0320 10:58:54.100762 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mzrcb" event={"ID":"e96190cb-8d03-4cb7-b3f6-6b46a141f969","Type":"ContainerStarted","Data":"a385b8dfa2b4cc8c33beb2702f2c9eb8e77c3930cc53a97e0a9560c3a40f0c2d"} Mar 20 10:58:54 crc kubenswrapper[4695]: I0320 10:58:54.164516 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p78dm" event={"ID":"0b5f000a-cdbc-486a-9e77-d3bf68046cb7","Type":"ContainerStarted","Data":"050a44b4542238f8ce92766bf19917b2a2456b70bd029b400da70c7736b4ac32"} Mar 20 10:58:54 crc kubenswrapper[4695]: I0320 10:58:54.275275 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d59c4667b-crj8h" event={"ID":"eabf108b-6be6-45b9-a150-4623727fca50","Type":"ContainerStarted","Data":"bc1983ce608e041fadb71c68b861261e04d000183ed7d3b50116bf96fc89c472"} Mar 20 10:58:54 crc kubenswrapper[4695]: I0320 10:58:54.281460 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6fqn2" event={"ID":"33f393cc-11cf-4c7a-aeac-8423998e5dc6","Type":"ContainerStarted","Data":"39fa253f4fb4d8b339cbed6aefb5697ca3d071430c6b22982434c71141011611"} Mar 20 10:58:54 crc kubenswrapper[4695]: I0320 10:58:54.285246 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kt9vc" 
event={"ID":"53830966-0b62-40fe-9f81-c18c95ea50aa","Type":"ContainerStarted","Data":"913ce7560cbc38d6772f461d47e6929975407b292ec0b623014e42802efa6f87"} Mar 20 10:58:54 crc kubenswrapper[4695]: I0320 10:58:54.287553 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jnfk6" event={"ID":"2a6824e3-65ec-404c-ac28-59fce8d50d83","Type":"ContainerStarted","Data":"993b9e6920765275d4bbed6b9be2801725022b179539b3816889fde842587fb7"} Mar 20 10:58:54 crc kubenswrapper[4695]: I0320 10:58:54.323020 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shq4g" event={"ID":"eb233657-545c-4a0b-93a0-b29148b5cb3f","Type":"ContainerStarted","Data":"4fb592d7ee5f03ec779af1f27c3247e880a518de5546ad8b20d95bc5c009e608"} Mar 20 10:58:54 crc kubenswrapper[4695]: I0320 10:58:54.325171 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6mw7h" event={"ID":"d2ea4f1f-16e3-4ac7-ac16-f782b94669ff","Type":"ContainerStarted","Data":"67a8e6fa1bf53ea14c96131e0c2f7614d425126facee389202f7ba0cfb63016a"} Mar 20 10:58:54 crc kubenswrapper[4695]: I0320 10:58:54.724430 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6fqn2" podStartSLOduration=5.883170084 podStartE2EDuration="1m36.724409666s" podCreationTimestamp="2026-03-20 10:57:18 +0000 UTC" firstStartedPulling="2026-03-20 10:57:21.722169268 +0000 UTC m=+219.502774831" lastFinishedPulling="2026-03-20 10:58:52.56340885 +0000 UTC m=+310.344014413" observedRunningTime="2026-03-20 10:58:54.723191505 +0000 UTC m=+312.503797068" watchObservedRunningTime="2026-03-20 10:58:54.724409666 +0000 UTC m=+312.505015239" Mar 20 10:58:54 crc kubenswrapper[4695]: I0320 10:58:54.757839 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kt9vc" podStartSLOduration=5.639290464 
podStartE2EDuration="1m33.757787396s" podCreationTimestamp="2026-03-20 10:57:21 +0000 UTC" firstStartedPulling="2026-03-20 10:57:24.300317808 +0000 UTC m=+222.080923371" lastFinishedPulling="2026-03-20 10:58:52.41881474 +0000 UTC m=+310.199420303" observedRunningTime="2026-03-20 10:58:54.754072483 +0000 UTC m=+312.534678056" watchObservedRunningTime="2026-03-20 10:58:54.757787396 +0000 UTC m=+312.538392959" Mar 20 10:58:55 crc kubenswrapper[4695]: I0320 10:58:55.433644 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fd986d555-bc6rq" event={"ID":"7c34e360-2b0a-43ea-b447-1cbec6c88848","Type":"ContainerStarted","Data":"c624d57c9639c6812ba39a1aa09f64c9af13db644066c54c7eb623a6cfdc5759"} Mar 20 10:58:55 crc kubenswrapper[4695]: I0320 10:58:55.433736 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fd986d555-bc6rq" event={"ID":"7c34e360-2b0a-43ea-b447-1cbec6c88848","Type":"ContainerStarted","Data":"5d52d0ec87b683bb5ba46b5bdcfcaab689e4d0ed944c6b86de8082044b2f88cf"} Mar 20 10:58:55 crc kubenswrapper[4695]: I0320 10:58:55.434098 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6fd986d555-bc6rq" Mar 20 10:58:55 crc kubenswrapper[4695]: I0320 10:58:55.436852 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d59c4667b-crj8h" event={"ID":"eabf108b-6be6-45b9-a150-4623727fca50","Type":"ContainerStarted","Data":"2261a989d73f06bdb30dbf4e5d60ff57bafcfdcf5b7cad6c9e64dc35b06bb864"} Mar 20 10:58:55 crc kubenswrapper[4695]: I0320 10:58:55.436942 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6d59c4667b-crj8h" Mar 20 10:58:55 crc kubenswrapper[4695]: I0320 10:58:55.437074 4695 patch_prober.go:28] interesting 
pod/controller-manager-6fd986d555-bc6rq container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" start-of-body= Mar 20 10:58:55 crc kubenswrapper[4695]: I0320 10:58:55.437140 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6fd986d555-bc6rq" podUID="7c34e360-2b0a-43ea-b447-1cbec6c88848" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" Mar 20 10:58:55 crc kubenswrapper[4695]: I0320 10:58:55.438267 4695 patch_prober.go:28] interesting pod/route-controller-manager-6d59c4667b-crj8h container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" start-of-body= Mar 20 10:58:55 crc kubenswrapper[4695]: I0320 10:58:55.438296 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6d59c4667b-crj8h" podUID="eabf108b-6be6-45b9-a150-4623727fca50" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" Mar 20 10:58:55 crc kubenswrapper[4695]: I0320 10:58:55.441344 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx" event={"ID":"a798f826-a3bf-4af3-abbb-eef3b1a77fd7","Type":"ContainerStarted","Data":"865b9e5ef0e001efe327df85edbe3e4b69cd5623f8e511c8398ed86550288519"} Mar 20 10:58:55 crc kubenswrapper[4695]: I0320 10:58:55.442587 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx" Mar 20 
10:58:55 crc kubenswrapper[4695]: I0320 10:58:55.465595 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6fd986d555-bc6rq" podStartSLOduration=17.465569726 podStartE2EDuration="17.465569726s" podCreationTimestamp="2026-03-20 10:58:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:55.460726554 +0000 UTC m=+313.241332127" watchObservedRunningTime="2026-03-20 10:58:55.465569726 +0000 UTC m=+313.246175299" Mar 20 10:58:55 crc kubenswrapper[4695]: I0320 10:58:55.541381 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx" podStartSLOduration=47.541343043 podStartE2EDuration="47.541343043s" podCreationTimestamp="2026-03-20 10:58:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:55.493696914 +0000 UTC m=+313.274302477" watchObservedRunningTime="2026-03-20 10:58:55.541343043 +0000 UTC m=+313.321948606" Mar 20 10:58:55 crc kubenswrapper[4695]: I0320 10:58:55.832861 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6fd986d555-bc6rq" Mar 20 10:58:55 crc kubenswrapper[4695]: I0320 10:58:55.861168 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6d59c4667b-crj8h" podStartSLOduration=17.861139745 podStartE2EDuration="17.861139745s" podCreationTimestamp="2026-03-20 10:58:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:55.581609367 +0000 UTC m=+313.362214930" watchObservedRunningTime="2026-03-20 10:58:55.861139745 +0000 UTC m=+313.641745308" Mar 
20 10:58:56 crc kubenswrapper[4695]: I0320 10:58:56.442463 4695 patch_prober.go:28] interesting pod/oauth-openshift-5fd878c9cf-ff5kx container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 10:58:56 crc kubenswrapper[4695]: I0320 10:58:56.443196 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx" podUID="a798f826-a3bf-4af3-abbb-eef3b1a77fd7" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.66:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 10:58:56 crc kubenswrapper[4695]: I0320 10:58:56.450738 4695 generic.go:334] "Generic (PLEG): container finished" podID="d2ea4f1f-16e3-4ac7-ac16-f782b94669ff" containerID="67a8e6fa1bf53ea14c96131e0c2f7614d425126facee389202f7ba0cfb63016a" exitCode=0 Mar 20 10:58:56 crc kubenswrapper[4695]: I0320 10:58:56.450822 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6mw7h" event={"ID":"d2ea4f1f-16e3-4ac7-ac16-f782b94669ff","Type":"ContainerDied","Data":"67a8e6fa1bf53ea14c96131e0c2f7614d425126facee389202f7ba0cfb63016a"} Mar 20 10:58:56 crc kubenswrapper[4695]: I0320 10:58:56.467834 4695 generic.go:334] "Generic (PLEG): container finished" podID="0b5f000a-cdbc-486a-9e77-d3bf68046cb7" containerID="050a44b4542238f8ce92766bf19917b2a2456b70bd029b400da70c7736b4ac32" exitCode=0 Mar 20 10:58:56 crc kubenswrapper[4695]: I0320 10:58:56.468772 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p78dm" event={"ID":"0b5f000a-cdbc-486a-9e77-d3bf68046cb7","Type":"ContainerDied","Data":"050a44b4542238f8ce92766bf19917b2a2456b70bd029b400da70c7736b4ac32"} Mar 20 
10:58:56 crc kubenswrapper[4695]: I0320 10:58:56.529448 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6d59c4667b-crj8h" Mar 20 10:58:56 crc kubenswrapper[4695]: I0320 10:58:56.638043 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5fd878c9cf-ff5kx" Mar 20 10:58:57 crc kubenswrapper[4695]: I0320 10:58:57.508713 4695 generic.go:334] "Generic (PLEG): container finished" podID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" containerID="a385b8dfa2b4cc8c33beb2702f2c9eb8e77c3930cc53a97e0a9560c3a40f0c2d" exitCode=0 Mar 20 10:58:57 crc kubenswrapper[4695]: I0320 10:58:57.508854 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mzrcb" event={"ID":"e96190cb-8d03-4cb7-b3f6-6b46a141f969","Type":"ContainerDied","Data":"a385b8dfa2b4cc8c33beb2702f2c9eb8e77c3930cc53a97e0a9560c3a40f0c2d"} Mar 20 10:58:57 crc kubenswrapper[4695]: I0320 10:58:57.521948 4695 generic.go:334] "Generic (PLEG): container finished" podID="eb233657-545c-4a0b-93a0-b29148b5cb3f" containerID="4fb592d7ee5f03ec779af1f27c3247e880a518de5546ad8b20d95bc5c009e608" exitCode=0 Mar 20 10:58:57 crc kubenswrapper[4695]: I0320 10:58:57.522833 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shq4g" event={"ID":"eb233657-545c-4a0b-93a0-b29148b5cb3f","Type":"ContainerDied","Data":"4fb592d7ee5f03ec779af1f27c3247e880a518de5546ad8b20d95bc5c009e608"} Mar 20 10:58:58 crc kubenswrapper[4695]: I0320 10:58:58.665167 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p78dm" event={"ID":"0b5f000a-cdbc-486a-9e77-d3bf68046cb7","Type":"ContainerStarted","Data":"a4d1fe52926db7a4a4059229f0e59594d06e2a09b3fca6a9eb924c27b8c410a4"} Mar 20 10:58:58 crc kubenswrapper[4695]: I0320 10:58:58.668885 4695 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-6mw7h" event={"ID":"d2ea4f1f-16e3-4ac7-ac16-f782b94669ff","Type":"ContainerStarted","Data":"7e6a5ba88866cb4447a1dc39d8d44d8ccfed81d6432188d2eb5a558ea2b9da29"} Mar 20 10:58:58 crc kubenswrapper[4695]: I0320 10:58:58.695788 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-p78dm" podStartSLOduration=5.515259869 podStartE2EDuration="1m38.695726198s" podCreationTimestamp="2026-03-20 10:57:20 +0000 UTC" firstStartedPulling="2026-03-20 10:57:24.221809505 +0000 UTC m=+222.002415078" lastFinishedPulling="2026-03-20 10:58:57.402275844 +0000 UTC m=+315.182881407" observedRunningTime="2026-03-20 10:58:58.69100728 +0000 UTC m=+316.471612853" watchObservedRunningTime="2026-03-20 10:58:58.695726198 +0000 UTC m=+316.476331761" Mar 20 10:58:59 crc kubenswrapper[4695]: I0320 10:58:59.129478 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6fqn2" Mar 20 10:58:59 crc kubenswrapper[4695]: I0320 10:58:59.129560 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6fqn2" Mar 20 10:58:59 crc kubenswrapper[4695]: I0320 10:58:59.680826 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shq4g" event={"ID":"eb233657-545c-4a0b-93a0-b29148b5cb3f","Type":"ContainerStarted","Data":"861dfc4058068d4c4f57d676e81bd038ec64ab10c904d822de8da01044c941df"} Mar 20 10:58:59 crc kubenswrapper[4695]: I0320 10:58:59.684762 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mzrcb" event={"ID":"e96190cb-8d03-4cb7-b3f6-6b46a141f969","Type":"ContainerStarted","Data":"1d38591f7dfdb9068d761219345ed096498521f05df89266d75b9376745f3db4"} Mar 20 10:58:59 crc kubenswrapper[4695]: I0320 10:58:59.794247 4695 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-marketplace/redhat-marketplace-6mw7h" podStartSLOduration=5.5042526899999995 podStartE2EDuration="1m39.794222564s" podCreationTimestamp="2026-03-20 10:57:20 +0000 UTC" firstStartedPulling="2026-03-20 10:57:23.126364294 +0000 UTC m=+220.906969857" lastFinishedPulling="2026-03-20 10:58:57.416334168 +0000 UTC m=+315.196939731" observedRunningTime="2026-03-20 10:58:58.724243746 +0000 UTC m=+316.504849309" watchObservedRunningTime="2026-03-20 10:58:59.794222564 +0000 UTC m=+317.574828127" Mar 20 10:58:59 crc kubenswrapper[4695]: I0320 10:58:59.829120 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-shq4g" podStartSLOduration=4.545463306 podStartE2EDuration="1m40.829099192s" podCreationTimestamp="2026-03-20 10:57:19 +0000 UTC" firstStartedPulling="2026-03-20 10:57:21.814399794 +0000 UTC m=+219.595005357" lastFinishedPulling="2026-03-20 10:58:58.09803568 +0000 UTC m=+315.878641243" observedRunningTime="2026-03-20 10:58:59.799374924 +0000 UTC m=+317.579980507" watchObservedRunningTime="2026-03-20 10:58:59.829099192 +0000 UTC m=+317.609704755" Mar 20 10:58:59 crc kubenswrapper[4695]: I0320 10:58:59.831377 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mzrcb" podStartSLOduration=5.54153396 podStartE2EDuration="1m41.831364299s" podCreationTimestamp="2026-03-20 10:57:18 +0000 UTC" firstStartedPulling="2026-03-20 10:57:21.770200445 +0000 UTC m=+219.550806008" lastFinishedPulling="2026-03-20 10:58:58.060030784 +0000 UTC m=+315.840636347" observedRunningTime="2026-03-20 10:58:59.82861565 +0000 UTC m=+317.609221213" watchObservedRunningTime="2026-03-20 10:58:59.831364299 +0000 UTC m=+317.611969862" Mar 20 10:58:59 crc kubenswrapper[4695]: I0320 10:58:59.891496 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-shq4g" Mar 20 10:58:59 crc 
kubenswrapper[4695]: I0320 10:58:59.891567 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-shq4g" Mar 20 10:59:00 crc kubenswrapper[4695]: I0320 10:59:00.804471 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-6fqn2" podUID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" containerName="registry-server" probeResult="failure" output=< Mar 20 10:59:00 crc kubenswrapper[4695]: timeout: failed to connect service ":50051" within 1s Mar 20 10:59:00 crc kubenswrapper[4695]: > Mar 20 10:59:00 crc kubenswrapper[4695]: I0320 10:59:00.984632 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-p78dm" Mar 20 10:59:00 crc kubenswrapper[4695]: I0320 10:59:00.984790 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-p78dm" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.081214 4695 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.082624 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.123152 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.174838 4695 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.175258 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d" gracePeriod=15 Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.175366 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb" gracePeriod=15 Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.175425 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832" gracePeriod=15 Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.175522 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276" gracePeriod=15 Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 
10:59:01.175403 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c" gracePeriod=15 Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.180310 4695 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 10:59:01 crc kubenswrapper[4695]: E0320 10:59:01.180673 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.180702 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:01 crc kubenswrapper[4695]: E0320 10:59:01.180717 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.180726 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:01 crc kubenswrapper[4695]: E0320 10:59:01.180741 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.180749 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:01 crc kubenswrapper[4695]: E0320 10:59:01.180758 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 10:59:01 
crc kubenswrapper[4695]: I0320 10:59:01.180765 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 10:59:01 crc kubenswrapper[4695]: E0320 10:59:01.180778 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.180811 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 10:59:01 crc kubenswrapper[4695]: E0320 10:59:01.180822 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.180830 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 10:59:01 crc kubenswrapper[4695]: E0320 10:59:01.180845 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.180853 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 10:59:01 crc kubenswrapper[4695]: E0320 10:59:01.180877 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.180885 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.181073 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.181091 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.181102 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.181113 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.181127 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.181139 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.181150 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 10:59:01 crc kubenswrapper[4695]: E0320 10:59:01.181289 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.181300 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.181431 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 10:59:01 crc kubenswrapper[4695]: 
I0320 10:59:01.210655 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.210725 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.210764 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.210801 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.210889 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.315119 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.315183 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.315217 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.315247 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.315272 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.315303 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.315367 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.315398 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.315555 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.315629 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.315681 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.315723 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.315790 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.320424 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6mw7h" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.320493 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6mw7h" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.416952 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" 
Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.417073 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.417160 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.417168 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.417307 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.417359 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.420328 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.809888 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.815091 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.817427 4695 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832" exitCode=2 Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.818860 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"e35f93d06c8159620a5244a626e0c4e98a2827dbc97a5513929d9f0efe812984"} Mar 20 10:59:01 crc kubenswrapper[4695]: I0320 10:59:01.838312 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-shq4g" podUID="eb233657-545c-4a0b-93a0-b29148b5cb3f" containerName="registry-server" probeResult="failure" output=< Mar 20 10:59:01 crc kubenswrapper[4695]: timeout: failed to connect service ":50051" within 1s Mar 20 10:59:01 crc kubenswrapper[4695]: > Mar 20 10:59:02 crc kubenswrapper[4695]: I0320 10:59:02.039743 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-p78dm" podUID="0b5f000a-cdbc-486a-9e77-d3bf68046cb7" containerName="registry-server" probeResult="failure" output=< Mar 20 10:59:02 crc kubenswrapper[4695]: timeout: failed to connect service ":50051" within 1s Mar 20 10:59:02 crc kubenswrapper[4695]: > Mar 20 
10:59:02 crc kubenswrapper[4695]: I0320 10:59:02.335409 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kt9vc" Mar 20 10:59:02 crc kubenswrapper[4695]: I0320 10:59:02.335521 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kt9vc" Mar 20 10:59:02 crc kubenswrapper[4695]: I0320 10:59:02.385681 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-6mw7h" podUID="d2ea4f1f-16e3-4ac7-ac16-f782b94669ff" containerName="registry-server" probeResult="failure" output=< Mar 20 10:59:02 crc kubenswrapper[4695]: timeout: failed to connect service ":50051" within 1s Mar 20 10:59:02 crc kubenswrapper[4695]: > Mar 20 10:59:02 crc kubenswrapper[4695]: I0320 10:59:02.830855 4695 generic.go:334] "Generic (PLEG): container finished" podID="a82b7512-ebb9-42df-aff3-e8a7077b0fcf" containerID="2cd830d4896439d024a1e2a77b1f538d3fae2bebae9f4a05964503da39f2ef5c" exitCode=0 Mar 20 10:59:02 crc kubenswrapper[4695]: I0320 10:59:02.830959 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a82b7512-ebb9-42df-aff3-e8a7077b0fcf","Type":"ContainerDied","Data":"2cd830d4896439d024a1e2a77b1f538d3fae2bebae9f4a05964503da39f2ef5c"} Mar 20 10:59:02 crc kubenswrapper[4695]: I0320 10:59:02.832733 4695 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:02 crc kubenswrapper[4695]: I0320 10:59:02.832861 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"d1b84c2a233b0a09f98d8609c24d637449dc6b4f8f916207e1b4cb5e78c31147"} Mar 20 10:59:02 crc kubenswrapper[4695]: I0320 10:59:02.833164 4695 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:02 crc kubenswrapper[4695]: I0320 10:59:02.833821 4695 status_manager.go:851] "Failed to get status for pod" podUID="a82b7512-ebb9-42df-aff3-e8a7077b0fcf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:02 crc kubenswrapper[4695]: I0320 10:59:02.835824 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 10:59:02 crc kubenswrapper[4695]: I0320 10:59:02.837865 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 10:59:02 crc kubenswrapper[4695]: I0320 10:59:02.840857 4695 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb" exitCode=0 Mar 20 10:59:02 crc kubenswrapper[4695]: I0320 10:59:02.840926 4695 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276" exitCode=0 Mar 20 10:59:02 crc kubenswrapper[4695]: I0320 10:59:02.840950 4695 
generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c" exitCode=0 Mar 20 10:59:02 crc kubenswrapper[4695]: I0320 10:59:02.840978 4695 scope.go:117] "RemoveContainer" containerID="cdbd560b4a4d3c74481083ae89fe26260697acd200c7136a582658aac94806bc" Mar 20 10:59:02 crc kubenswrapper[4695]: I0320 10:59:02.844957 4695 generic.go:334] "Generic (PLEG): container finished" podID="2a6824e3-65ec-404c-ac28-59fce8d50d83" containerID="993b9e6920765275d4bbed6b9be2801725022b179539b3816889fde842587fb7" exitCode=0 Mar 20 10:59:02 crc kubenswrapper[4695]: I0320 10:59:02.845016 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jnfk6" event={"ID":"2a6824e3-65ec-404c-ac28-59fce8d50d83","Type":"ContainerDied","Data":"993b9e6920765275d4bbed6b9be2801725022b179539b3816889fde842587fb7"} Mar 20 10:59:02 crc kubenswrapper[4695]: I0320 10:59:02.846964 4695 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:02 crc kubenswrapper[4695]: I0320 10:59:02.847735 4695 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:02 crc kubenswrapper[4695]: I0320 10:59:02.848131 4695 status_manager.go:851] "Failed to get status for pod" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" pod="openshift-marketplace/redhat-operators-jnfk6" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jnfk6\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:02 crc kubenswrapper[4695]: I0320 10:59:02.848413 4695 status_manager.go:851] "Failed to get status for pod" podUID="a82b7512-ebb9-42df-aff3-e8a7077b0fcf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:02 crc kubenswrapper[4695]: I0320 10:59:02.890747 4695 status_manager.go:851] "Failed to get status for pod" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" pod="openshift-marketplace/redhat-operators-jnfk6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jnfk6\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:02 crc kubenswrapper[4695]: I0320 10:59:02.892186 4695 status_manager.go:851] "Failed to get status for pod" podUID="a82b7512-ebb9-42df-aff3-e8a7077b0fcf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:02 crc kubenswrapper[4695]: I0320 10:59:02.892791 4695 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:02 crc kubenswrapper[4695]: I0320 10:59:02.893144 4695 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:03 crc kubenswrapper[4695]: I0320 10:59:03.395837 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kt9vc" podUID="53830966-0b62-40fe-9f81-c18c95ea50aa" containerName="registry-server" probeResult="failure" output=< Mar 20 10:59:03 crc kubenswrapper[4695]: timeout: failed to connect service ":50051" within 1s Mar 20 10:59:03 crc kubenswrapper[4695]: > Mar 20 10:59:03 crc kubenswrapper[4695]: I0320 10:59:03.854111 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 10:59:03 crc kubenswrapper[4695]: I0320 10:59:03.856437 4695 status_manager.go:851] "Failed to get status for pod" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" pod="openshift-marketplace/redhat-operators-jnfk6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jnfk6\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:03 crc kubenswrapper[4695]: I0320 10:59:03.856939 4695 status_manager.go:851] "Failed to get status for pod" podUID="a82b7512-ebb9-42df-aff3-e8a7077b0fcf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:03 crc kubenswrapper[4695]: I0320 10:59:03.857211 4695 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: 
connect: connection refused" Mar 20 10:59:03 crc kubenswrapper[4695]: I0320 10:59:03.961777 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:59:03 crc kubenswrapper[4695]: I0320 10:59:03.962243 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:59:03 crc kubenswrapper[4695]: I0320 10:59:03.962296 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:59:03 crc kubenswrapper[4695]: I0320 10:59:03.962464 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:59:03 crc kubenswrapper[4695]: W0320 10:59:03.962713 4695 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get 
"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27225": dial tcp 38.102.83.51:6443: connect: connection refused Mar 20 10:59:03 crc kubenswrapper[4695]: E0320 10:59:03.962808 4695 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27225\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:59:03 crc kubenswrapper[4695]: W0320 10:59:03.963046 4695 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27225": dial tcp 38.102.83.51:6443: connect: connection refused Mar 20 10:59:03 crc kubenswrapper[4695]: E0320 10:59:03.963187 4695 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27225\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:59:03 crc kubenswrapper[4695]: W0320 10:59:03.963260 4695 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27214": dial tcp 38.102.83.51:6443: 
connect: connection refused Mar 20 10:59:03 crc kubenswrapper[4695]: E0320 10:59:03.963317 4695 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27214\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:59:04 crc kubenswrapper[4695]: I0320 10:59:04.233213 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:04 crc kubenswrapper[4695]: I0320 10:59:04.234212 4695 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:04 crc kubenswrapper[4695]: I0320 10:59:04.234941 4695 status_manager.go:851] "Failed to get status for pod" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" pod="openshift-marketplace/redhat-operators-jnfk6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jnfk6\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:04 crc kubenswrapper[4695]: I0320 10:59:04.235717 4695 status_manager.go:851] "Failed to get status for pod" podUID="a82b7512-ebb9-42df-aff3-e8a7077b0fcf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:04 crc kubenswrapper[4695]: I0320 10:59:04.394458 4695 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a82b7512-ebb9-42df-aff3-e8a7077b0fcf-var-lock\") pod \"a82b7512-ebb9-42df-aff3-e8a7077b0fcf\" (UID: \"a82b7512-ebb9-42df-aff3-e8a7077b0fcf\") " Mar 20 10:59:04 crc kubenswrapper[4695]: I0320 10:59:04.394561 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a82b7512-ebb9-42df-aff3-e8a7077b0fcf-kubelet-dir\") pod \"a82b7512-ebb9-42df-aff3-e8a7077b0fcf\" (UID: \"a82b7512-ebb9-42df-aff3-e8a7077b0fcf\") " Mar 20 10:59:04 crc kubenswrapper[4695]: I0320 10:59:04.394656 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a82b7512-ebb9-42df-aff3-e8a7077b0fcf-kube-api-access\") pod \"a82b7512-ebb9-42df-aff3-e8a7077b0fcf\" (UID: \"a82b7512-ebb9-42df-aff3-e8a7077b0fcf\") " Mar 20 10:59:04 crc kubenswrapper[4695]: I0320 10:59:04.394830 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a82b7512-ebb9-42df-aff3-e8a7077b0fcf-var-lock" (OuterVolumeSpecName: "var-lock") pod "a82b7512-ebb9-42df-aff3-e8a7077b0fcf" (UID: "a82b7512-ebb9-42df-aff3-e8a7077b0fcf"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:59:04 crc kubenswrapper[4695]: I0320 10:59:04.394830 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a82b7512-ebb9-42df-aff3-e8a7077b0fcf-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a82b7512-ebb9-42df-aff3-e8a7077b0fcf" (UID: "a82b7512-ebb9-42df-aff3-e8a7077b0fcf"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:59:04 crc kubenswrapper[4695]: I0320 10:59:04.395392 4695 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a82b7512-ebb9-42df-aff3-e8a7077b0fcf-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:04 crc kubenswrapper[4695]: I0320 10:59:04.395496 4695 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a82b7512-ebb9-42df-aff3-e8a7077b0fcf-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:04 crc kubenswrapper[4695]: I0320 10:59:04.413448 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a82b7512-ebb9-42df-aff3-e8a7077b0fcf-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a82b7512-ebb9-42df-aff3-e8a7077b0fcf" (UID: "a82b7512-ebb9-42df-aff3-e8a7077b0fcf"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:04 crc kubenswrapper[4695]: I0320 10:59:04.500585 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a82b7512-ebb9-42df-aff3-e8a7077b0fcf-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:04 crc kubenswrapper[4695]: I0320 10:59:04.864114 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"a82b7512-ebb9-42df-aff3-e8a7077b0fcf","Type":"ContainerDied","Data":"360cd8e879fb363c0ce407fc88c4d6441f6d1b7e550ee2e9e693bc9b1a9733df"} Mar 20 10:59:04 crc kubenswrapper[4695]: I0320 10:59:04.864173 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="360cd8e879fb363c0ce407fc88c4d6441f6d1b7e550ee2e9e693bc9b1a9733df" Mar 20 10:59:04 crc kubenswrapper[4695]: I0320 10:59:04.864142 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:04 crc kubenswrapper[4695]: I0320 10:59:04.869573 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 10:59:04 crc kubenswrapper[4695]: I0320 10:59:04.870366 4695 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d" exitCode=0 Mar 20 10:59:04 crc kubenswrapper[4695]: I0320 10:59:04.870428 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="147ac7e93836f8a593983c3c3f8472c61926c1ded010bb5d6a9a16e1717c3fa4" Mar 20 10:59:04 crc kubenswrapper[4695]: I0320 10:59:04.891209 4695 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:04 crc kubenswrapper[4695]: I0320 10:59:04.891590 4695 status_manager.go:851] "Failed to get status for pod" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" pod="openshift-marketplace/redhat-operators-jnfk6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jnfk6\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:04 crc kubenswrapper[4695]: I0320 10:59:04.891963 4695 status_manager.go:851] "Failed to get status for pod" podUID="a82b7512-ebb9-42df-aff3-e8a7077b0fcf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:04 crc 
kubenswrapper[4695]: I0320 10:59:04.895174 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 10:59:04 crc kubenswrapper[4695]: I0320 10:59:04.897071 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:04 crc kubenswrapper[4695]: I0320 10:59:04.897615 4695 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:04 crc kubenswrapper[4695]: I0320 10:59:04.897810 4695 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:04 crc kubenswrapper[4695]: I0320 10:59:04.898027 4695 status_manager.go:851] "Failed to get status for pod" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" pod="openshift-marketplace/redhat-operators-jnfk6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jnfk6\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:04 crc kubenswrapper[4695]: I0320 10:59:04.898309 4695 status_manager.go:851] "Failed to get status for pod" podUID="a82b7512-ebb9-42df-aff3-e8a7077b0fcf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 
10:59:04 crc kubenswrapper[4695]: E0320 10:59:04.962795 4695 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: failed to sync configmap cache: timed out waiting for the condition Mar 20 10:59:04 crc kubenswrapper[4695]: E0320 10:59:04.962975 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 11:01:06.962935052 +0000 UTC m=+444.743540615 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync configmap cache: timed out waiting for the condition Mar 20 10:59:04 crc kubenswrapper[4695]: E0320 10:59:04.963142 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 20 10:59:04 crc kubenswrapper[4695]: E0320 10:59:04.963142 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 20 10:59:04 crc kubenswrapper[4695]: E0320 10:59:04.963223 4695 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 10:59:04 crc kubenswrapper[4695]: E0320 10:59:04.963354 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 11:01:06.963323791 +0000 UTC m=+444.743929354 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync secret cache: timed out waiting for the condition Mar 20 10:59:04 crc kubenswrapper[4695]: W0320 10:59:04.963845 4695 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27225": dial tcp 38.102.83.51:6443: connect: connection refused Mar 20 10:59:04 crc kubenswrapper[4695]: E0320 10:59:04.963982 4695 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27225\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:59:05 crc kubenswrapper[4695]: I0320 10:59:05.009435 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 10:59:05 crc kubenswrapper[4695]: I0320 10:59:05.009524 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 10:59:05 crc kubenswrapper[4695]: I0320 10:59:05.009575 4695 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:59:05 crc kubenswrapper[4695]: I0320 10:59:05.009587 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 10:59:05 crc kubenswrapper[4695]: I0320 10:59:05.009603 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:59:05 crc kubenswrapper[4695]: I0320 10:59:05.009694 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:59:05 crc kubenswrapper[4695]: I0320 10:59:05.010213 4695 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:05 crc kubenswrapper[4695]: I0320 10:59:05.010233 4695 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:05 crc kubenswrapper[4695]: I0320 10:59:05.010243 4695 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:05 crc kubenswrapper[4695]: I0320 10:59:05.937363 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:05 crc kubenswrapper[4695]: I0320 10:59:05.957211 4695 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:05 crc kubenswrapper[4695]: I0320 10:59:05.957825 4695 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:05 crc kubenswrapper[4695]: I0320 10:59:05.958093 4695 status_manager.go:851] "Failed to get status for pod" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" 
pod="openshift-marketplace/redhat-operators-jnfk6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jnfk6\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:05 crc kubenswrapper[4695]: I0320 10:59:05.958447 4695 status_manager.go:851] "Failed to get status for pod" podUID="a82b7512-ebb9-42df-aff3-e8a7077b0fcf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:05 crc kubenswrapper[4695]: E0320 10:59:05.963537 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 20 10:59:05 crc kubenswrapper[4695]: E0320 10:59:05.963576 4695 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: failed to sync configmap cache: timed out waiting for the condition Mar 20 10:59:05 crc kubenswrapper[4695]: E0320 10:59:05.963623 4695 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 20 10:59:05 crc kubenswrapper[4695]: E0320 10:59:05.963654 4695 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: failed to sync configmap cache: timed out waiting for the condition Mar 20 10:59:05 crc kubenswrapper[4695]: E0320 10:59:05.963684 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-20 11:01:07.963655226 +0000 UTC m=+445.744260789 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : failed to sync configmap cache: timed out waiting for the condition Mar 20 10:59:05 crc kubenswrapper[4695]: E0320 10:59:05.963737 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 11:01:07.963704947 +0000 UTC m=+445.744310520 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : failed to sync configmap cache: timed out waiting for the condition Mar 20 10:59:06 crc kubenswrapper[4695]: W0320 10:59:06.373958 4695 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27225": dial tcp 38.102.83.51:6443: connect: connection refused Mar 20 10:59:06 crc kubenswrapper[4695]: E0320 10:59:06.374141 4695 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27225\": dial tcp 38.102.83.51:6443: connect: connection refused" 
logger="UnhandledError" Mar 20 10:59:06 crc kubenswrapper[4695]: W0320 10:59:06.454606 4695 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27214": dial tcp 38.102.83.51:6443: connect: connection refused Mar 20 10:59:06 crc kubenswrapper[4695]: E0320 10:59:06.454724 4695 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27214\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:59:06 crc kubenswrapper[4695]: E0320 10:59:06.464836 4695 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.51:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e878caab78bff openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:59:01.459880959 +0000 UTC m=+319.240486522,LastTimestamp:2026-03-20 10:59:01.459880959 +0000 UTC 
m=+319.240486522,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:59:06 crc kubenswrapper[4695]: W0320 10:59:06.589686 4695 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27225": dial tcp 38.102.83.51:6443: connect: connection refused Mar 20 10:59:06 crc kubenswrapper[4695]: E0320 10:59:06.589771 4695 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27225\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:59:06 crc kubenswrapper[4695]: I0320 10:59:06.894727 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 20 10:59:06 crc kubenswrapper[4695]: E0320 10:59:06.903506 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-cqllr], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:59:06 crc kubenswrapper[4695]: E0320 10:59:06.910108 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-s2dwl], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:59:06 crc kubenswrapper[4695]: E0320 10:59:06.917500 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert nginx-conf], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:59:07 crc kubenswrapper[4695]: W0320 10:59:07.109316 4695 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27225": dial tcp 38.102.83.51:6443: connect: connection refused Mar 20 10:59:07 crc kubenswrapper[4695]: E0320 10:59:07.109425 4695 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27225\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.002883 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mzrcb" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.004047 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mzrcb" Mar 20 10:59:09 crc kubenswrapper[4695]: E0320 10:59:09.046799 4695 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: 
connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: E0320 10:59:09.047529 4695 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: E0320 10:59:09.047866 4695 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: E0320 10:59:09.048342 4695 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: E0320 10:59:09.048891 4695 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.048946 4695 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 20 10:59:09 crc kubenswrapper[4695]: E0320 10:59:09.049213 4695 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" interval="200ms" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.071349 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mzrcb" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.072538 4695 
status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.073025 4695 status_manager.go:851] "Failed to get status for pod" podUID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" pod="openshift-marketplace/community-operators-mzrcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-mzrcb\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.073321 4695 status_manager.go:851] "Failed to get status for pod" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" pod="openshift-marketplace/redhat-operators-jnfk6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jnfk6\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.073712 4695 status_manager.go:851] "Failed to get status for pod" podUID="a82b7512-ebb9-42df-aff3-e8a7077b0fcf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.170325 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6fqn2" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.171261 4695 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.171886 4695 status_manager.go:851] "Failed to get status for pod" podUID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" pod="openshift-marketplace/community-operators-mzrcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-mzrcb\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.172390 4695 status_manager.go:851] "Failed to get status for pod" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" pod="openshift-marketplace/redhat-operators-jnfk6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jnfk6\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.172764 4695 status_manager.go:851] "Failed to get status for pod" podUID="a82b7512-ebb9-42df-aff3-e8a7077b0fcf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.173143 4695 status_manager.go:851] "Failed to get status for pod" podUID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" pod="openshift-marketplace/community-operators-6fqn2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-6fqn2\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.210305 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6fqn2" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.211049 
4695 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.211496 4695 status_manager.go:851] "Failed to get status for pod" podUID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" pod="openshift-marketplace/community-operators-mzrcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-mzrcb\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.211830 4695 status_manager.go:851] "Failed to get status for pod" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" pod="openshift-marketplace/redhat-operators-jnfk6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jnfk6\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.212257 4695 status_manager.go:851] "Failed to get status for pod" podUID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" pod="openshift-marketplace/community-operators-6fqn2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-6fqn2\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.212715 4695 status_manager.go:851] "Failed to get status for pod" podUID="a82b7512-ebb9-42df-aff3-e8a7077b0fcf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: E0320 10:59:09.249893 4695 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" interval="400ms" Mar 20 10:59:09 crc kubenswrapper[4695]: E0320 10:59:09.606369 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:59:09Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:59:09Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:59:09Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:59:09Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4869c69128f74d9c3b178ea6c8c8d38df169b6bce05eb821a65f0aaf514c563a\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:f3c2ad90e251062165f8d6623ca4994c0b3e28324e4b5b17fd588b162ec97766\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1746912226},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:6bec6e4ce9b3ff606
58829df2f5980cf947458d49b97476cee1ff01ec638d309\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:b259d760a14ca994ed34d4cfc901758f180cc8d1de7f3c427baa68030e06b7c7\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1252654287},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:09401dfac9bba0cd922bbab834aad7cc8ec81f91c045783406fa3f33b21788c7\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:72220febb8ec2067b8c08124888d9a7664cf187280e0b5fd5ff031f4d6cae681\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1223644375},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes
\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operat
or@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c
42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: E0320 10:59:09.607396 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: E0320 10:59:09.608215 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: E0320 10:59:09.608833 4695 kubelet_node_status.go:585] "Error updating node status, 
will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: E0320 10:59:09.609315 4695 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: E0320 10:59:09.609349 4695 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:59:09 crc kubenswrapper[4695]: E0320 10:59:09.651316 4695 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" interval="800ms" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.935531 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-shq4g" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.936643 4695 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.937116 4695 status_manager.go:851] "Failed to get status for pod" podUID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" pod="openshift-marketplace/community-operators-mzrcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-mzrcb\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc 
kubenswrapper[4695]: I0320 10:59:09.937343 4695 status_manager.go:851] "Failed to get status for pod" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" pod="openshift-marketplace/redhat-operators-jnfk6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jnfk6\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.937567 4695 status_manager.go:851] "Failed to get status for pod" podUID="a82b7512-ebb9-42df-aff3-e8a7077b0fcf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.937783 4695 status_manager.go:851] "Failed to get status for pod" podUID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" pod="openshift-marketplace/community-operators-6fqn2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-6fqn2\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.938053 4695 status_manager.go:851] "Failed to get status for pod" podUID="eb233657-545c-4a0b-93a0-b29148b5cb3f" pod="openshift-marketplace/certified-operators-shq4g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-shq4g\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.977009 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-shq4g" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.977817 4695 status_manager.go:851] "Failed to get status for pod" podUID="eb233657-545c-4a0b-93a0-b29148b5cb3f" pod="openshift-marketplace/certified-operators-shq4g" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-shq4g\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.978292 4695 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.978527 4695 status_manager.go:851] "Failed to get status for pod" podUID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" pod="openshift-marketplace/community-operators-mzrcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-mzrcb\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.978712 4695 status_manager.go:851] "Failed to get status for pod" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" pod="openshift-marketplace/redhat-operators-jnfk6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jnfk6\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.978945 4695 status_manager.go:851] "Failed to get status for pod" podUID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" pod="openshift-marketplace/community-operators-6fqn2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-6fqn2\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4695]: I0320 10:59:09.979618 4695 status_manager.go:851] "Failed to get status for pod" podUID="a82b7512-ebb9-42df-aff3-e8a7077b0fcf" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:10 crc kubenswrapper[4695]: I0320 10:59:10.011578 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mzrcb" Mar 20 10:59:10 crc kubenswrapper[4695]: I0320 10:59:10.012455 4695 status_manager.go:851] "Failed to get status for pod" podUID="eb233657-545c-4a0b-93a0-b29148b5cb3f" pod="openshift-marketplace/certified-operators-shq4g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-shq4g\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:10 crc kubenswrapper[4695]: I0320 10:59:10.012676 4695 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:10 crc kubenswrapper[4695]: I0320 10:59:10.012839 4695 status_manager.go:851] "Failed to get status for pod" podUID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" pod="openshift-marketplace/community-operators-mzrcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-mzrcb\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:10 crc kubenswrapper[4695]: I0320 10:59:10.013225 4695 status_manager.go:851] "Failed to get status for pod" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" pod="openshift-marketplace/redhat-operators-jnfk6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jnfk6\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:10 crc kubenswrapper[4695]: I0320 
10:59:10.013750 4695 status_manager.go:851] "Failed to get status for pod" podUID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" pod="openshift-marketplace/community-operators-6fqn2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-6fqn2\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:10 crc kubenswrapper[4695]: I0320 10:59:10.013988 4695 status_manager.go:851] "Failed to get status for pod" podUID="a82b7512-ebb9-42df-aff3-e8a7077b0fcf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:10 crc kubenswrapper[4695]: W0320 10:59:10.272577 4695 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27214": dial tcp 38.102.83.51:6443: connect: connection refused Mar 20 10:59:10 crc kubenswrapper[4695]: E0320 10:59:10.273132 4695 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27214\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:59:10 crc kubenswrapper[4695]: E0320 10:59:10.452802 4695 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" interval="1.6s" Mar 20 10:59:10 crc 
kubenswrapper[4695]: W0320 10:59:10.700450 4695 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27225": dial tcp 38.102.83.51:6443: connect: connection refused Mar 20 10:59:10 crc kubenswrapper[4695]: E0320 10:59:10.700547 4695 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27225\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:59:10 crc kubenswrapper[4695]: I0320 10:59:10.984364 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jnfk6" event={"ID":"2a6824e3-65ec-404c-ac28-59fce8d50d83","Type":"ContainerStarted","Data":"5a830ceb7bbdd95e6b3b71760c04f7298528598d10fe78bb2af5d0010f83de08"} Mar 20 10:59:10 crc kubenswrapper[4695]: I0320 10:59:10.985826 4695 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:10 crc kubenswrapper[4695]: I0320 10:59:10.986200 4695 status_manager.go:851] "Failed to get status for pod" podUID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" pod="openshift-marketplace/community-operators-mzrcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-mzrcb\": dial tcp 38.102.83.51:6443: connect: connection refused" 
Mar 20 10:59:10 crc kubenswrapper[4695]: I0320 10:59:10.986494 4695 status_manager.go:851] "Failed to get status for pod" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" pod="openshift-marketplace/redhat-operators-jnfk6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jnfk6\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:10 crc kubenswrapper[4695]: I0320 10:59:10.986756 4695 status_manager.go:851] "Failed to get status for pod" podUID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" pod="openshift-marketplace/community-operators-6fqn2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-6fqn2\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:10 crc kubenswrapper[4695]: I0320 10:59:10.986981 4695 status_manager.go:851] "Failed to get status for pod" podUID="a82b7512-ebb9-42df-aff3-e8a7077b0fcf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:10 crc kubenswrapper[4695]: I0320 10:59:10.987365 4695 status_manager.go:851] "Failed to get status for pod" podUID="eb233657-545c-4a0b-93a0-b29148b5cb3f" pod="openshift-marketplace/certified-operators-shq4g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-shq4g\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.034106 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-p78dm" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.034726 4695 status_manager.go:851] "Failed to get status for pod" podUID="eb233657-545c-4a0b-93a0-b29148b5cb3f" pod="openshift-marketplace/certified-operators-shq4g" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-shq4g\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.035353 4695 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.036280 4695 status_manager.go:851] "Failed to get status for pod" podUID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" pod="openshift-marketplace/community-operators-mzrcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-mzrcb\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.036799 4695 status_manager.go:851] "Failed to get status for pod" podUID="0b5f000a-cdbc-486a-9e77-d3bf68046cb7" pod="openshift-marketplace/redhat-marketplace-p78dm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-p78dm\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.037127 4695 status_manager.go:851] "Failed to get status for pod" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" pod="openshift-marketplace/redhat-operators-jnfk6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jnfk6\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.037468 4695 status_manager.go:851] "Failed to get status for pod" podUID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" pod="openshift-marketplace/community-operators-6fqn2" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-6fqn2\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.037856 4695 status_manager.go:851] "Failed to get status for pod" podUID="a82b7512-ebb9-42df-aff3-e8a7077b0fcf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.080238 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-p78dm" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.081159 4695 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.081648 4695 status_manager.go:851] "Failed to get status for pod" podUID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" pod="openshift-marketplace/community-operators-mzrcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-mzrcb\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.082098 4695 status_manager.go:851] "Failed to get status for pod" podUID="0b5f000a-cdbc-486a-9e77-d3bf68046cb7" pod="openshift-marketplace/redhat-marketplace-p78dm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-p78dm\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 
10:59:11.082383 4695 status_manager.go:851] "Failed to get status for pod" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" pod="openshift-marketplace/redhat-operators-jnfk6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jnfk6\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.082688 4695 status_manager.go:851] "Failed to get status for pod" podUID="a82b7512-ebb9-42df-aff3-e8a7077b0fcf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.082962 4695 status_manager.go:851] "Failed to get status for pod" podUID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" pod="openshift-marketplace/community-operators-6fqn2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-6fqn2\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.083192 4695 status_manager.go:851] "Failed to get status for pod" podUID="eb233657-545c-4a0b-93a0-b29148b5cb3f" pod="openshift-marketplace/certified-operators-shq4g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-shq4g\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: W0320 10:59:11.117473 4695 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27225": dial tcp 38.102.83.51:6443: connect: connection refused Mar 20 10:59:11 crc kubenswrapper[4695]: E0320 10:59:11.118972 
4695 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27225\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.379938 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6mw7h" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.381283 4695 status_manager.go:851] "Failed to get status for pod" podUID="eb233657-545c-4a0b-93a0-b29148b5cb3f" pod="openshift-marketplace/certified-operators-shq4g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-shq4g\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.381968 4695 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.383141 4695 status_manager.go:851] "Failed to get status for pod" podUID="d2ea4f1f-16e3-4ac7-ac16-f782b94669ff" pod="openshift-marketplace/redhat-marketplace-6mw7h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6mw7h\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.383846 4695 status_manager.go:851] "Failed to get status for pod" 
podUID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" pod="openshift-marketplace/community-operators-mzrcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-mzrcb\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.384387 4695 status_manager.go:851] "Failed to get status for pod" podUID="0b5f000a-cdbc-486a-9e77-d3bf68046cb7" pod="openshift-marketplace/redhat-marketplace-p78dm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-p78dm\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.384728 4695 status_manager.go:851] "Failed to get status for pod" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" pod="openshift-marketplace/redhat-operators-jnfk6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jnfk6\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.385187 4695 status_manager.go:851] "Failed to get status for pod" podUID="a82b7512-ebb9-42df-aff3-e8a7077b0fcf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.385623 4695 status_manager.go:851] "Failed to get status for pod" podUID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" pod="openshift-marketplace/community-operators-6fqn2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-6fqn2\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.436650 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-6mw7h" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.438658 4695 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.440646 4695 status_manager.go:851] "Failed to get status for pod" podUID="d2ea4f1f-16e3-4ac7-ac16-f782b94669ff" pod="openshift-marketplace/redhat-marketplace-6mw7h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6mw7h\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.441809 4695 status_manager.go:851] "Failed to get status for pod" podUID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" pod="openshift-marketplace/community-operators-mzrcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-mzrcb\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.442259 4695 status_manager.go:851] "Failed to get status for pod" podUID="0b5f000a-cdbc-486a-9e77-d3bf68046cb7" pod="openshift-marketplace/redhat-marketplace-p78dm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-p78dm\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.442814 4695 status_manager.go:851] "Failed to get status for pod" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" pod="openshift-marketplace/redhat-operators-jnfk6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jnfk6\": dial tcp 
38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.443675 4695 status_manager.go:851] "Failed to get status for pod" podUID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" pod="openshift-marketplace/community-operators-6fqn2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-6fqn2\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.444110 4695 status_manager.go:851] "Failed to get status for pod" podUID="a82b7512-ebb9-42df-aff3-e8a7077b0fcf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: I0320 10:59:11.444662 4695 status_manager.go:851] "Failed to get status for pod" podUID="eb233657-545c-4a0b-93a0-b29148b5cb3f" pod="openshift-marketplace/certified-operators-shq4g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-shq4g\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:11 crc kubenswrapper[4695]: W0320 10:59:11.719721 4695 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27225": dial tcp 38.102.83.51:6443: connect: connection refused Mar 20 10:59:11 crc kubenswrapper[4695]: E0320 10:59:11.720067 4695 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27225\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:59:12 crc kubenswrapper[4695]: E0320 10:59:12.054245 4695 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" interval="3.2s" Mar 20 10:59:12 crc kubenswrapper[4695]: I0320 10:59:12.379167 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kt9vc" Mar 20 10:59:12 crc kubenswrapper[4695]: I0320 10:59:12.379981 4695 status_manager.go:851] "Failed to get status for pod" podUID="eb233657-545c-4a0b-93a0-b29148b5cb3f" pod="openshift-marketplace/certified-operators-shq4g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-shq4g\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:12 crc kubenswrapper[4695]: I0320 10:59:12.380723 4695 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:12 crc kubenswrapper[4695]: I0320 10:59:12.381100 4695 status_manager.go:851] "Failed to get status for pod" podUID="d2ea4f1f-16e3-4ac7-ac16-f782b94669ff" pod="openshift-marketplace/redhat-marketplace-6mw7h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6mw7h\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:12 crc 
kubenswrapper[4695]: I0320 10:59:12.381439 4695 status_manager.go:851] "Failed to get status for pod" podUID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" pod="openshift-marketplace/community-operators-mzrcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-mzrcb\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:12 crc kubenswrapper[4695]: I0320 10:59:12.381712 4695 status_manager.go:851] "Failed to get status for pod" podUID="0b5f000a-cdbc-486a-9e77-d3bf68046cb7" pod="openshift-marketplace/redhat-marketplace-p78dm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-p78dm\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:12 crc kubenswrapper[4695]: I0320 10:59:12.381958 4695 status_manager.go:851] "Failed to get status for pod" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" pod="openshift-marketplace/redhat-operators-jnfk6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jnfk6\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:12 crc kubenswrapper[4695]: I0320 10:59:12.382968 4695 status_manager.go:851] "Failed to get status for pod" podUID="a82b7512-ebb9-42df-aff3-e8a7077b0fcf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:12 crc kubenswrapper[4695]: I0320 10:59:12.383460 4695 status_manager.go:851] "Failed to get status for pod" podUID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" pod="openshift-marketplace/community-operators-6fqn2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-6fqn2\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:12 crc kubenswrapper[4695]: I0320 
10:59:12.386945 4695 status_manager.go:851] "Failed to get status for pod" podUID="53830966-0b62-40fe-9f81-c18c95ea50aa" pod="openshift-marketplace/redhat-operators-kt9vc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-kt9vc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:12 crc kubenswrapper[4695]: I0320 10:59:12.423164 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kt9vc" Mar 20 10:59:12 crc kubenswrapper[4695]: I0320 10:59:12.424721 4695 status_manager.go:851] "Failed to get status for pod" podUID="eb233657-545c-4a0b-93a0-b29148b5cb3f" pod="openshift-marketplace/certified-operators-shq4g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-shq4g\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:12 crc kubenswrapper[4695]: I0320 10:59:12.425398 4695 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:12 crc kubenswrapper[4695]: I0320 10:59:12.426061 4695 status_manager.go:851] "Failed to get status for pod" podUID="d2ea4f1f-16e3-4ac7-ac16-f782b94669ff" pod="openshift-marketplace/redhat-marketplace-6mw7h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6mw7h\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:12 crc kubenswrapper[4695]: I0320 10:59:12.426953 4695 status_manager.go:851] "Failed to get status for pod" podUID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" pod="openshift-marketplace/community-operators-mzrcb" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-mzrcb\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:12 crc kubenswrapper[4695]: I0320 10:59:12.427303 4695 status_manager.go:851] "Failed to get status for pod" podUID="0b5f000a-cdbc-486a-9e77-d3bf68046cb7" pod="openshift-marketplace/redhat-marketplace-p78dm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-p78dm\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:12 crc kubenswrapper[4695]: I0320 10:59:12.427571 4695 status_manager.go:851] "Failed to get status for pod" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" pod="openshift-marketplace/redhat-operators-jnfk6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jnfk6\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:12 crc kubenswrapper[4695]: I0320 10:59:12.427871 4695 status_manager.go:851] "Failed to get status for pod" podUID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" pod="openshift-marketplace/community-operators-6fqn2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-6fqn2\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:12 crc kubenswrapper[4695]: I0320 10:59:12.428273 4695 status_manager.go:851] "Failed to get status for pod" podUID="53830966-0b62-40fe-9f81-c18c95ea50aa" pod="openshift-marketplace/redhat-operators-kt9vc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-kt9vc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:12 crc kubenswrapper[4695]: I0320 10:59:12.428602 4695 status_manager.go:851] "Failed to get status for pod" podUID="a82b7512-ebb9-42df-aff3-e8a7077b0fcf" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:12 crc kubenswrapper[4695]: I0320 10:59:12.645753 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jnfk6" Mar 20 10:59:12 crc kubenswrapper[4695]: I0320 10:59:12.645813 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jnfk6" Mar 20 10:59:12 crc kubenswrapper[4695]: E0320 10:59:12.862717 4695 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.51:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e878caab78bff openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:59:01.459880959 +0000 UTC m=+319.240486522,LastTimestamp:2026-03-20 10:59:01.459880959 +0000 UTC m=+319.240486522,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:59:12 crc kubenswrapper[4695]: I0320 10:59:12.890540 4695 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:12 crc kubenswrapper[4695]: I0320 10:59:12.891007 4695 status_manager.go:851] "Failed to get status for pod" podUID="d2ea4f1f-16e3-4ac7-ac16-f782b94669ff" pod="openshift-marketplace/redhat-marketplace-6mw7h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6mw7h\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:12 crc kubenswrapper[4695]: I0320 10:59:12.891475 4695 status_manager.go:851] "Failed to get status for pod" podUID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" pod="openshift-marketplace/community-operators-mzrcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-mzrcb\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:12 crc kubenswrapper[4695]: I0320 10:59:12.891767 4695 status_manager.go:851] "Failed to get status for pod" podUID="0b5f000a-cdbc-486a-9e77-d3bf68046cb7" pod="openshift-marketplace/redhat-marketplace-p78dm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-p78dm\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:12 crc kubenswrapper[4695]: I0320 10:59:12.893210 4695 status_manager.go:851] "Failed to get status for pod" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" pod="openshift-marketplace/redhat-operators-jnfk6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jnfk6\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:12 crc kubenswrapper[4695]: I0320 10:59:12.893751 4695 status_manager.go:851] "Failed to get status for pod" podUID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" pod="openshift-marketplace/community-operators-6fqn2" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-6fqn2\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:12 crc kubenswrapper[4695]: I0320 10:59:12.894066 4695 status_manager.go:851] "Failed to get status for pod" podUID="53830966-0b62-40fe-9f81-c18c95ea50aa" pod="openshift-marketplace/redhat-operators-kt9vc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-kt9vc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:12 crc kubenswrapper[4695]: I0320 10:59:12.894386 4695 status_manager.go:851] "Failed to get status for pod" podUID="a82b7512-ebb9-42df-aff3-e8a7077b0fcf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:12 crc kubenswrapper[4695]: I0320 10:59:12.894685 4695 status_manager.go:851] "Failed to get status for pod" podUID="eb233657-545c-4a0b-93a0-b29148b5cb3f" pod="openshift-marketplace/certified-operators-shq4g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-shq4g\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:13 crc kubenswrapper[4695]: I0320 10:59:13.683250 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jnfk6" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" containerName="registry-server" probeResult="failure" output=< Mar 20 10:59:13 crc kubenswrapper[4695]: timeout: failed to connect service ":50051" within 1s Mar 20 10:59:13 crc kubenswrapper[4695]: > Mar 20 10:59:14 crc kubenswrapper[4695]: I0320 10:59:14.887083 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:14 crc kubenswrapper[4695]: I0320 10:59:14.888561 4695 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:14 crc kubenswrapper[4695]: I0320 10:59:14.889521 4695 status_manager.go:851] "Failed to get status for pod" podUID="d2ea4f1f-16e3-4ac7-ac16-f782b94669ff" pod="openshift-marketplace/redhat-marketplace-6mw7h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6mw7h\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:14 crc kubenswrapper[4695]: I0320 10:59:14.903365 4695 status_manager.go:851] "Failed to get status for pod" podUID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" pod="openshift-marketplace/community-operators-mzrcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-mzrcb\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:14 crc kubenswrapper[4695]: I0320 10:59:14.904379 4695 status_manager.go:851] "Failed to get status for pod" podUID="0b5f000a-cdbc-486a-9e77-d3bf68046cb7" pod="openshift-marketplace/redhat-marketplace-p78dm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-p78dm\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:14 crc kubenswrapper[4695]: I0320 10:59:14.904490 4695 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="65c5b07a-a076-493a-8d05-5b297c74da55" Mar 20 10:59:14 crc kubenswrapper[4695]: I0320 10:59:14.904514 4695 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="65c5b07a-a076-493a-8d05-5b297c74da55" Mar 20 10:59:14 crc kubenswrapper[4695]: E0320 10:59:14.905151 4695 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:14 crc kubenswrapper[4695]: I0320 10:59:14.905356 4695 status_manager.go:851] "Failed to get status for pod" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" pod="openshift-marketplace/redhat-operators-jnfk6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jnfk6\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:14 crc kubenswrapper[4695]: I0320 10:59:14.905993 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:14 crc kubenswrapper[4695]: I0320 10:59:14.906389 4695 status_manager.go:851] "Failed to get status for pod" podUID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" pod="openshift-marketplace/community-operators-6fqn2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-6fqn2\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:14 crc kubenswrapper[4695]: I0320 10:59:14.907027 4695 status_manager.go:851] "Failed to get status for pod" podUID="53830966-0b62-40fe-9f81-c18c95ea50aa" pod="openshift-marketplace/redhat-operators-kt9vc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-kt9vc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:14 crc kubenswrapper[4695]: I0320 10:59:14.907444 4695 status_manager.go:851] "Failed to get status for pod" podUID="a82b7512-ebb9-42df-aff3-e8a7077b0fcf" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:14 crc kubenswrapper[4695]: I0320 10:59:14.908114 4695 status_manager.go:851] "Failed to get status for pod" podUID="eb233657-545c-4a0b-93a0-b29148b5cb3f" pod="openshift-marketplace/certified-operators-shq4g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-shq4g\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:15 crc kubenswrapper[4695]: I0320 10:59:15.015882 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 10:59:15 crc kubenswrapper[4695]: I0320 10:59:15.017862 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 10:59:15 crc kubenswrapper[4695]: I0320 10:59:15.017959 4695 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="33cdb444e3a559600355349111a57f3f6a8fdb565abafb0d1feb304f06a6f519" exitCode=1 Mar 20 10:59:15 crc kubenswrapper[4695]: I0320 10:59:15.018056 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"33cdb444e3a559600355349111a57f3f6a8fdb565abafb0d1feb304f06a6f519"} Mar 20 10:59:15 crc kubenswrapper[4695]: I0320 10:59:15.018857 4695 scope.go:117] "RemoveContainer" containerID="33cdb444e3a559600355349111a57f3f6a8fdb565abafb0d1feb304f06a6f519" Mar 20 10:59:15 crc kubenswrapper[4695]: I0320 10:59:15.019312 4695 status_manager.go:851] "Failed to get status for 
pod" podUID="53830966-0b62-40fe-9f81-c18c95ea50aa" pod="openshift-marketplace/redhat-operators-kt9vc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-kt9vc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:15 crc kubenswrapper[4695]: I0320 10:59:15.019554 4695 status_manager.go:851] "Failed to get status for pod" podUID="a82b7512-ebb9-42df-aff3-e8a7077b0fcf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:15 crc kubenswrapper[4695]: I0320 10:59:15.019971 4695 status_manager.go:851] "Failed to get status for pod" podUID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" pod="openshift-marketplace/community-operators-6fqn2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-6fqn2\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:15 crc kubenswrapper[4695]: I0320 10:59:15.020465 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e3f15dd23e38e8bf4419d2f9ee2e1aaa1cef9dbea583625eb416e80d4bbee46e"} Mar 20 10:59:15 crc kubenswrapper[4695]: I0320 10:59:15.020482 4695 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:15 crc kubenswrapper[4695]: I0320 10:59:15.020772 4695 status_manager.go:851] "Failed to get status for pod" podUID="eb233657-545c-4a0b-93a0-b29148b5cb3f" 
pod="openshift-marketplace/certified-operators-shq4g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-shq4g\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:15 crc kubenswrapper[4695]: I0320 10:59:15.021211 4695 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:15 crc kubenswrapper[4695]: I0320 10:59:15.021424 4695 status_manager.go:851] "Failed to get status for pod" podUID="d2ea4f1f-16e3-4ac7-ac16-f782b94669ff" pod="openshift-marketplace/redhat-marketplace-6mw7h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6mw7h\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:15 crc kubenswrapper[4695]: I0320 10:59:15.021603 4695 status_manager.go:851] "Failed to get status for pod" podUID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" pod="openshift-marketplace/community-operators-mzrcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-mzrcb\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:15 crc kubenswrapper[4695]: I0320 10:59:15.021841 4695 status_manager.go:851] "Failed to get status for pod" podUID="0b5f000a-cdbc-486a-9e77-d3bf68046cb7" pod="openshift-marketplace/redhat-marketplace-p78dm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-p78dm\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:15 crc kubenswrapper[4695]: I0320 10:59:15.022094 4695 status_manager.go:851] "Failed to get status for pod" 
podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" pod="openshift-marketplace/redhat-operators-jnfk6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jnfk6\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:15 crc kubenswrapper[4695]: E0320 10:59:15.255830 4695 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" interval="6.4s" Mar 20 10:59:15 crc kubenswrapper[4695]: I0320 10:59:15.350948 4695 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:59:16 crc kubenswrapper[4695]: I0320 10:59:16.030771 4695 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="f58729149bb8f035aacf1bb68602ed4425fd404c61a9c1a28268c6b7ab488dab" exitCode=0 Mar 20 10:59:16 crc kubenswrapper[4695]: I0320 10:59:16.030887 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"f58729149bb8f035aacf1bb68602ed4425fd404c61a9c1a28268c6b7ab488dab"} Mar 20 10:59:16 crc kubenswrapper[4695]: I0320 10:59:16.031292 4695 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="65c5b07a-a076-493a-8d05-5b297c74da55" Mar 20 10:59:16 crc kubenswrapper[4695]: I0320 10:59:16.031317 4695 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="65c5b07a-a076-493a-8d05-5b297c74da55" Mar 20 10:59:16 crc kubenswrapper[4695]: E0320 10:59:16.032312 4695 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:16 crc kubenswrapper[4695]: I0320 10:59:16.032635 4695 status_manager.go:851] "Failed to get status for pod" podUID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" pod="openshift-marketplace/community-operators-6fqn2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-6fqn2\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:16 crc kubenswrapper[4695]: I0320 10:59:16.032979 4695 status_manager.go:851] "Failed to get status for pod" podUID="53830966-0b62-40fe-9f81-c18c95ea50aa" pod="openshift-marketplace/redhat-operators-kt9vc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-kt9vc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:16 crc kubenswrapper[4695]: I0320 10:59:16.033423 4695 status_manager.go:851] "Failed to get status for pod" podUID="a82b7512-ebb9-42df-aff3-e8a7077b0fcf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:16 crc kubenswrapper[4695]: I0320 10:59:16.033754 4695 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:16 crc kubenswrapper[4695]: I0320 10:59:16.034052 4695 status_manager.go:851] "Failed to get status for pod" podUID="eb233657-545c-4a0b-93a0-b29148b5cb3f" 
pod="openshift-marketplace/certified-operators-shq4g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-shq4g\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:16 crc kubenswrapper[4695]: I0320 10:59:16.034648 4695 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:16 crc kubenswrapper[4695]: I0320 10:59:16.034986 4695 status_manager.go:851] "Failed to get status for pod" podUID="d2ea4f1f-16e3-4ac7-ac16-f782b94669ff" pod="openshift-marketplace/redhat-marketplace-6mw7h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6mw7h\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:16 crc kubenswrapper[4695]: I0320 10:59:16.035253 4695 status_manager.go:851] "Failed to get status for pod" podUID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" pod="openshift-marketplace/community-operators-mzrcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-mzrcb\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:16 crc kubenswrapper[4695]: I0320 10:59:16.035484 4695 status_manager.go:851] "Failed to get status for pod" podUID="0b5f000a-cdbc-486a-9e77-d3bf68046cb7" pod="openshift-marketplace/redhat-marketplace-p78dm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-p78dm\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:16 crc kubenswrapper[4695]: I0320 10:59:16.035736 4695 status_manager.go:851] "Failed to get status for pod" 
podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" pod="openshift-marketplace/redhat-operators-jnfk6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jnfk6\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:16 crc kubenswrapper[4695]: I0320 10:59:16.035764 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 10:59:16 crc kubenswrapper[4695]: I0320 10:59:16.036476 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 10:59:16 crc kubenswrapper[4695]: I0320 10:59:16.036553 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"21c8769f8942161d8a21ab94d1bafc7aaf8cebc1c9788ae8f9a5d8d4215c771c"} Mar 20 10:59:16 crc kubenswrapper[4695]: I0320 10:59:16.037343 4695 status_manager.go:851] "Failed to get status for pod" podUID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" pod="openshift-marketplace/community-operators-mzrcb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-mzrcb\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:16 crc kubenswrapper[4695]: I0320 10:59:16.037887 4695 status_manager.go:851] "Failed to get status for pod" podUID="0b5f000a-cdbc-486a-9e77-d3bf68046cb7" pod="openshift-marketplace/redhat-marketplace-p78dm" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-p78dm\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:16 crc kubenswrapper[4695]: I0320 10:59:16.038186 4695 status_manager.go:851] 
"Failed to get status for pod" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" pod="openshift-marketplace/redhat-operators-jnfk6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-jnfk6\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:16 crc kubenswrapper[4695]: I0320 10:59:16.038482 4695 status_manager.go:851] "Failed to get status for pod" podUID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" pod="openshift-marketplace/community-operators-6fqn2" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-6fqn2\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:16 crc kubenswrapper[4695]: I0320 10:59:16.038850 4695 status_manager.go:851] "Failed to get status for pod" podUID="53830966-0b62-40fe-9f81-c18c95ea50aa" pod="openshift-marketplace/redhat-operators-kt9vc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-kt9vc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:16 crc kubenswrapper[4695]: I0320 10:59:16.039223 4695 status_manager.go:851] "Failed to get status for pod" podUID="a82b7512-ebb9-42df-aff3-e8a7077b0fcf" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:16 crc kubenswrapper[4695]: I0320 10:59:16.039512 4695 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:16 crc kubenswrapper[4695]: I0320 10:59:16.039835 4695 status_manager.go:851] "Failed to get 
status for pod" podUID="eb233657-545c-4a0b-93a0-b29148b5cb3f" pod="openshift-marketplace/certified-operators-shq4g" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-shq4g\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:16 crc kubenswrapper[4695]: I0320 10:59:16.040156 4695 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:16 crc kubenswrapper[4695]: I0320 10:59:16.040396 4695 status_manager.go:851] "Failed to get status for pod" podUID="d2ea4f1f-16e3-4ac7-ac16-f782b94669ff" pod="openshift-marketplace/redhat-marketplace-6mw7h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-6mw7h\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 20 10:59:16 crc kubenswrapper[4695]: I0320 10:59:16.229254 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:59:17 crc kubenswrapper[4695]: I0320 10:59:17.051921 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"383865c2dc6ea8e5e80fececc8790ada43f5a807acae7c2760a0caff8ad32cc0"} Mar 20 10:59:17 crc kubenswrapper[4695]: I0320 10:59:17.052420 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1bd991e955c88a69b53e95ef7ffe21fb91e46145589077dd52a8e071cec8c9a3"} Mar 20 10:59:17 crc kubenswrapper[4695]: I0320 10:59:17.052450 
4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8c25d30baead067dcf98b1cf3a0086043fce077a24eef9730374bee851cdcd1d"} Mar 20 10:59:17 crc kubenswrapper[4695]: I0320 10:59:17.886717 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:59:18 crc kubenswrapper[4695]: I0320 10:59:18.064270 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d2f8fcb543868fc0091db28d8e3b144b8560ae2b98cc72969007e23bc38d8d95"} Mar 20 10:59:18 crc kubenswrapper[4695]: I0320 10:59:18.064352 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"04399ac467152164cafd705d7246e3f161b7b199fec5da2718c06ceed77da9c9"} Mar 20 10:59:18 crc kubenswrapper[4695]: I0320 10:59:18.065041 4695 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="65c5b07a-a076-493a-8d05-5b297c74da55" Mar 20 10:59:18 crc kubenswrapper[4695]: I0320 10:59:18.065097 4695 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="65c5b07a-a076-493a-8d05-5b297c74da55" Mar 20 10:59:18 crc kubenswrapper[4695]: I0320 10:59:18.668203 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:59:18 crc kubenswrapper[4695]: I0320 10:59:18.673169 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:59:19 crc kubenswrapper[4695]: I0320 10:59:19.886101 4695 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:59:19 crc kubenswrapper[4695]: I0320 10:59:19.906212 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:19 crc kubenswrapper[4695]: I0320 10:59:19.906268 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:19 crc kubenswrapper[4695]: I0320 10:59:19.911741 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:20 crc kubenswrapper[4695]: I0320 10:59:20.450398 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 10:59:21 crc kubenswrapper[4695]: I0320 10:59:21.890418 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:59:22 crc kubenswrapper[4695]: I0320 10:59:22.691475 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jnfk6" Mar 20 10:59:22 crc kubenswrapper[4695]: I0320 10:59:22.733594 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jnfk6" Mar 20 10:59:23 crc kubenswrapper[4695]: I0320 10:59:23.079223 4695 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:23 crc kubenswrapper[4695]: I0320 10:59:23.104080 4695 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="65c5b07a-a076-493a-8d05-5b297c74da55" Mar 20 10:59:23 crc kubenswrapper[4695]: I0320 10:59:23.104151 4695 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="65c5b07a-a076-493a-8d05-5b297c74da55" Mar 20 10:59:23 crc kubenswrapper[4695]: I0320 10:59:23.104096 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:23 crc kubenswrapper[4695]: I0320 10:59:23.109238 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:23 crc kubenswrapper[4695]: I0320 10:59:23.305730 4695 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="8e2bc4de-2734-4579-b205-06828d2f82c6" Mar 20 10:59:23 crc kubenswrapper[4695]: I0320 10:59:23.343588 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 10:59:23 crc kubenswrapper[4695]: I0320 10:59:23.461811 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 10:59:23 crc kubenswrapper[4695]: I0320 10:59:23.853632 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 10:59:24 crc kubenswrapper[4695]: I0320 10:59:24.114740 4695 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="65c5b07a-a076-493a-8d05-5b297c74da55" Mar 20 10:59:24 crc kubenswrapper[4695]: I0320 10:59:24.114781 4695 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="65c5b07a-a076-493a-8d05-5b297c74da55" Mar 20 10:59:24 crc kubenswrapper[4695]: I0320 10:59:24.118187 4695 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" 
podUID="8e2bc4de-2734-4579-b205-06828d2f82c6" Mar 20 10:59:25 crc kubenswrapper[4695]: I0320 10:59:25.120954 4695 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="65c5b07a-a076-493a-8d05-5b297c74da55" Mar 20 10:59:25 crc kubenswrapper[4695]: I0320 10:59:25.121568 4695 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="65c5b07a-a076-493a-8d05-5b297c74da55" Mar 20 10:59:25 crc kubenswrapper[4695]: I0320 10:59:25.125144 4695 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="8e2bc4de-2734-4579-b205-06828d2f82c6" Mar 20 10:59:26 crc kubenswrapper[4695]: I0320 10:59:26.233120 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:59:32 crc kubenswrapper[4695]: I0320 10:59:32.662085 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 10:59:33 crc kubenswrapper[4695]: I0320 10:59:33.219059 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 10:59:33 crc kubenswrapper[4695]: I0320 10:59:33.365822 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 10:59:33 crc kubenswrapper[4695]: I0320 10:59:33.512830 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 10:59:33 crc kubenswrapper[4695]: I0320 10:59:33.725128 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 10:59:33 crc kubenswrapper[4695]: I0320 10:59:33.838844 4695 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 10:59:33 crc kubenswrapper[4695]: I0320 10:59:33.928568 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 10:59:33 crc kubenswrapper[4695]: I0320 10:59:33.960568 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 10:59:34 crc kubenswrapper[4695]: I0320 10:59:34.018723 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 10:59:34 crc kubenswrapper[4695]: I0320 10:59:34.094186 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 10:59:34 crc kubenswrapper[4695]: I0320 10:59:34.686139 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 10:59:34 crc kubenswrapper[4695]: I0320 10:59:34.870020 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 10:59:34 crc kubenswrapper[4695]: I0320 10:59:34.898688 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 10:59:34 crc kubenswrapper[4695]: I0320 10:59:34.900290 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 10:59:35 crc kubenswrapper[4695]: I0320 10:59:35.030890 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 10:59:35 crc kubenswrapper[4695]: I0320 10:59:35.345675 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 10:59:35 crc 
kubenswrapper[4695]: I0320 10:59:35.556552 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 10:59:35 crc kubenswrapper[4695]: I0320 10:59:35.567791 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 10:59:35 crc kubenswrapper[4695]: I0320 10:59:35.698585 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 10:59:35 crc kubenswrapper[4695]: I0320 10:59:35.854331 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 10:59:35 crc kubenswrapper[4695]: I0320 10:59:35.895740 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 10:59:35 crc kubenswrapper[4695]: I0320 10:59:35.936058 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 10:59:35 crc kubenswrapper[4695]: I0320 10:59:35.938164 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 10:59:35 crc kubenswrapper[4695]: I0320 10:59:35.953752 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 10:59:35 crc kubenswrapper[4695]: I0320 10:59:35.965474 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 10:59:36 crc kubenswrapper[4695]: I0320 10:59:36.108508 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 10:59:36 crc kubenswrapper[4695]: I0320 10:59:36.128695 4695 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 10:59:36 crc kubenswrapper[4695]: I0320 10:59:36.174460 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 10:59:36 crc kubenswrapper[4695]: I0320 10:59:36.224162 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 10:59:36 crc kubenswrapper[4695]: I0320 10:59:36.245372 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 10:59:36 crc kubenswrapper[4695]: I0320 10:59:36.304089 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 10:59:36 crc kubenswrapper[4695]: I0320 10:59:36.367399 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 10:59:36 crc kubenswrapper[4695]: I0320 10:59:36.470986 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 10:59:36 crc kubenswrapper[4695]: I0320 10:59:36.643823 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 10:59:36 crc kubenswrapper[4695]: I0320 10:59:36.720401 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 10:59:36 crc kubenswrapper[4695]: I0320 10:59:36.801731 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 10:59:36 crc kubenswrapper[4695]: I0320 10:59:36.921198 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 10:59:36 crc kubenswrapper[4695]: I0320 
10:59:36.948509 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 10:59:37 crc kubenswrapper[4695]: I0320 10:59:37.141786 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 10:59:37 crc kubenswrapper[4695]: I0320 10:59:37.190946 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 10:59:37 crc kubenswrapper[4695]: I0320 10:59:37.250742 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 10:59:37 crc kubenswrapper[4695]: I0320 10:59:37.276117 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 10:59:37 crc kubenswrapper[4695]: I0320 10:59:37.284617 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 10:59:37 crc kubenswrapper[4695]: I0320 10:59:37.363145 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 10:59:37 crc kubenswrapper[4695]: I0320 10:59:37.442500 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 10:59:37 crc kubenswrapper[4695]: I0320 10:59:37.490050 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 10:59:37 crc kubenswrapper[4695]: I0320 10:59:37.490206 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 10:59:37 crc kubenswrapper[4695]: I0320 10:59:37.508315 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 10:59:37 crc 
kubenswrapper[4695]: I0320 10:59:37.618720 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 10:59:37 crc kubenswrapper[4695]: I0320 10:59:37.660978 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 10:59:37 crc kubenswrapper[4695]: I0320 10:59:37.673944 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 10:59:37 crc kubenswrapper[4695]: I0320 10:59:37.719121 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 10:59:37 crc kubenswrapper[4695]: I0320 10:59:37.734465 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 10:59:37 crc kubenswrapper[4695]: I0320 10:59:37.859023 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 10:59:37 crc kubenswrapper[4695]: I0320 10:59:37.862931 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 10:59:37 crc kubenswrapper[4695]: I0320 10:59:37.915134 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 10:59:37 crc kubenswrapper[4695]: I0320 10:59:37.921509 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 10:59:37 crc kubenswrapper[4695]: I0320 10:59:37.940331 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 10:59:38 crc kubenswrapper[4695]: I0320 10:59:38.032045 4695 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 10:59:38 crc kubenswrapper[4695]: I0320 10:59:38.062527 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 10:59:38 crc kubenswrapper[4695]: I0320 10:59:38.139247 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 10:59:38 crc kubenswrapper[4695]: I0320 10:59:38.186879 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 10:59:38 crc kubenswrapper[4695]: I0320 10:59:38.247270 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 10:59:38 crc kubenswrapper[4695]: I0320 10:59:38.267383 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 10:59:38 crc kubenswrapper[4695]: I0320 10:59:38.268504 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 10:59:38 crc kubenswrapper[4695]: I0320 10:59:38.351044 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 10:59:38 crc kubenswrapper[4695]: I0320 10:59:38.436589 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 10:59:38 crc kubenswrapper[4695]: I0320 10:59:38.478038 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 10:59:38 crc kubenswrapper[4695]: I0320 10:59:38.576173 4695 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 10:59:38 crc 
kubenswrapper[4695]: I0320 10:59:38.676484 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 10:59:38 crc kubenswrapper[4695]: I0320 10:59:38.739777 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 10:59:38 crc kubenswrapper[4695]: I0320 10:59:38.828462 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 10:59:38 crc kubenswrapper[4695]: I0320 10:59:38.918430 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 10:59:39 crc kubenswrapper[4695]: I0320 10:59:39.073086 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 10:59:39 crc kubenswrapper[4695]: I0320 10:59:39.075645 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 10:59:39 crc kubenswrapper[4695]: I0320 10:59:39.107882 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 10:59:39 crc kubenswrapper[4695]: I0320 10:59:39.209114 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 10:59:39 crc kubenswrapper[4695]: I0320 10:59:39.210202 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 10:59:39 crc kubenswrapper[4695]: I0320 10:59:39.386041 4695 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 10:59:39 crc kubenswrapper[4695]: I0320 10:59:39.657456 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 
10:59:39 crc kubenswrapper[4695]: I0320 10:59:39.676219 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 10:59:39 crc kubenswrapper[4695]: I0320 10:59:39.676795 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 10:59:39 crc kubenswrapper[4695]: I0320 10:59:39.677048 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 10:59:39 crc kubenswrapper[4695]: I0320 10:59:39.677262 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 10:59:39 crc kubenswrapper[4695]: I0320 10:59:39.677440 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 10:59:39 crc kubenswrapper[4695]: I0320 10:59:39.677629 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 10:59:39 crc kubenswrapper[4695]: I0320 10:59:39.677924 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 10:59:39 crc kubenswrapper[4695]: I0320 10:59:39.682691 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 10:59:39 crc kubenswrapper[4695]: I0320 10:59:39.703700 4695 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 10:59:39 crc kubenswrapper[4695]: I0320 10:59:39.706176 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jnfk6" podStartSLOduration=32.953578081 podStartE2EDuration="2m17.706143745s" podCreationTimestamp="2026-03-20 10:57:22 +0000 UTC" firstStartedPulling="2026-03-20 10:57:24.145205932 +0000 UTC 
m=+221.925811495" lastFinishedPulling="2026-03-20 10:59:08.897771596 +0000 UTC m=+326.678377159" observedRunningTime="2026-03-20 10:59:23.194220048 +0000 UTC m=+340.974825621" watchObservedRunningTime="2026-03-20 10:59:39.706143745 +0000 UTC m=+357.486749308" Mar 20 10:59:39 crc kubenswrapper[4695]: I0320 10:59:39.709311 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=38.70926118 podStartE2EDuration="38.70926118s" podCreationTimestamp="2026-03-20 10:59:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:23.092658907 +0000 UTC m=+340.873264470" watchObservedRunningTime="2026-03-20 10:59:39.70926118 +0000 UTC m=+357.489866753" Mar 20 10:59:39 crc kubenswrapper[4695]: I0320 10:59:39.713541 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 10:59:39 crc kubenswrapper[4695]: I0320 10:59:39.713635 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 10:59:39 crc kubenswrapper[4695]: I0320 10:59:39.718125 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 10:59:39 crc kubenswrapper[4695]: I0320 10:59:39.718762 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:59:39 crc kubenswrapper[4695]: I0320 10:59:39.741411 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=16.741381809 podStartE2EDuration="16.741381809s" podCreationTimestamp="2026-03-20 10:59:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 10:59:39.739407701 +0000 UTC m=+357.520013284" watchObservedRunningTime="2026-03-20 10:59:39.741381809 +0000 UTC m=+357.521987372" Mar 20 10:59:39 crc kubenswrapper[4695]: I0320 10:59:39.814288 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.034165 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.047750 4695 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.112408 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.122606 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.150002 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.185982 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.234317 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.245689 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.269327 4695 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.271270 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.317014 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.339068 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.366717 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.380274 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.391178 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.464439 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.468477 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.469043 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.486105 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 
10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.587471 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.620149 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.633968 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.663817 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.687148 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.703054 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.717059 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.729299 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.730696 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.756449 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.842904 4695 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.845024 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.870818 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 10:59:40 crc kubenswrapper[4695]: I0320 10:59:40.967192 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 10:59:41 crc kubenswrapper[4695]: I0320 10:59:41.050031 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 10:59:41 crc kubenswrapper[4695]: I0320 10:59:41.052366 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 10:59:41 crc kubenswrapper[4695]: I0320 10:59:41.083375 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 10:59:41 crc kubenswrapper[4695]: I0320 10:59:41.109429 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 10:59:41 crc kubenswrapper[4695]: I0320 10:59:41.209523 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 10:59:41 crc kubenswrapper[4695]: I0320 10:59:41.219647 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 10:59:41 crc kubenswrapper[4695]: I0320 10:59:41.458544 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 
10:59:41 crc kubenswrapper[4695]: I0320 10:59:41.479797 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 10:59:41 crc kubenswrapper[4695]: I0320 10:59:41.485384 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 10:59:41 crc kubenswrapper[4695]: I0320 10:59:41.535531 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 10:59:41 crc kubenswrapper[4695]: I0320 10:59:41.543768 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 10:59:41 crc kubenswrapper[4695]: I0320 10:59:41.549867 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 10:59:41 crc kubenswrapper[4695]: I0320 10:59:41.576599 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 10:59:41 crc kubenswrapper[4695]: I0320 10:59:41.725671 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 10:59:41 crc kubenswrapper[4695]: I0320 10:59:41.782205 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 10:59:41 crc kubenswrapper[4695]: I0320 10:59:41.801705 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 10:59:41 crc kubenswrapper[4695]: I0320 10:59:41.801856 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 10:59:41 crc kubenswrapper[4695]: I0320 10:59:41.849953 4695 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 10:59:41 crc kubenswrapper[4695]: I0320 10:59:41.851496 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 10:59:41 crc kubenswrapper[4695]: I0320 10:59:41.932995 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 10:59:41 crc kubenswrapper[4695]: I0320 10:59:41.961394 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 10:59:42 crc kubenswrapper[4695]: I0320 10:59:42.033770 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 10:59:42 crc kubenswrapper[4695]: I0320 10:59:42.066852 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 10:59:42 crc kubenswrapper[4695]: I0320 10:59:42.099989 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 10:59:42 crc kubenswrapper[4695]: I0320 10:59:42.222165 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 10:59:42 crc kubenswrapper[4695]: I0320 10:59:42.352951 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 10:59:42 crc kubenswrapper[4695]: I0320 10:59:42.513860 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 10:59:42 crc kubenswrapper[4695]: I0320 10:59:42.521412 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 10:59:42 crc 
kubenswrapper[4695]: I0320 10:59:42.657873 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 10:59:42 crc kubenswrapper[4695]: I0320 10:59:42.709735 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 10:59:42 crc kubenswrapper[4695]: I0320 10:59:42.877017 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 10:59:42 crc kubenswrapper[4695]: I0320 10:59:42.888315 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 10:59:42 crc kubenswrapper[4695]: I0320 10:59:42.969505 4695 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 10:59:43 crc kubenswrapper[4695]: I0320 10:59:43.071607 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 10:59:43 crc kubenswrapper[4695]: I0320 10:59:43.106063 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 10:59:43 crc kubenswrapper[4695]: I0320 10:59:43.217564 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 10:59:43 crc kubenswrapper[4695]: I0320 10:59:43.245830 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 10:59:43 crc kubenswrapper[4695]: I0320 10:59:43.333957 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 10:59:43 crc kubenswrapper[4695]: I0320 10:59:43.381566 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 10:59:43 crc 
kubenswrapper[4695]: I0320 10:59:43.385680 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 10:59:43 crc kubenswrapper[4695]: I0320 10:59:43.394668 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 10:59:43 crc kubenswrapper[4695]: I0320 10:59:43.428493 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 10:59:43 crc kubenswrapper[4695]: I0320 10:59:43.481203 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 10:59:43 crc kubenswrapper[4695]: I0320 10:59:43.599498 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 10:59:43 crc kubenswrapper[4695]: I0320 10:59:43.629545 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 10:59:43 crc kubenswrapper[4695]: I0320 10:59:43.656115 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 10:59:43 crc kubenswrapper[4695]: I0320 10:59:43.694452 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 10:59:43 crc kubenswrapper[4695]: I0320 10:59:43.698539 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 10:59:43 crc kubenswrapper[4695]: I0320 10:59:43.723045 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 10:59:43 crc kubenswrapper[4695]: I0320 10:59:43.802742 4695 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 10:59:43 crc kubenswrapper[4695]: I0320 10:59:43.805405 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 10:59:43 crc kubenswrapper[4695]: I0320 10:59:43.825677 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 10:59:43 crc kubenswrapper[4695]: I0320 10:59:43.838468 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 10:59:43 crc kubenswrapper[4695]: I0320 10:59:43.941198 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 10:59:43 crc kubenswrapper[4695]: I0320 10:59:43.957666 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 10:59:44 crc kubenswrapper[4695]: I0320 10:59:44.058715 4695 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 10:59:44 crc kubenswrapper[4695]: I0320 10:59:44.156538 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 10:59:44 crc kubenswrapper[4695]: I0320 10:59:44.166181 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 10:59:44 crc kubenswrapper[4695]: I0320 10:59:44.174820 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 10:59:44 crc kubenswrapper[4695]: I0320 10:59:44.189217 4695 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 10:59:44 crc kubenswrapper[4695]: I0320 10:59:44.243589 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 10:59:44 crc kubenswrapper[4695]: I0320 10:59:44.304832 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 10:59:44 crc kubenswrapper[4695]: I0320 10:59:44.305423 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 10:59:44 crc kubenswrapper[4695]: I0320 10:59:44.327189 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 10:59:44 crc kubenswrapper[4695]: I0320 10:59:44.339118 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 10:59:44 crc kubenswrapper[4695]: I0320 10:59:44.339957 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 10:59:44 crc kubenswrapper[4695]: I0320 10:59:44.421964 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 10:59:44 crc kubenswrapper[4695]: I0320 10:59:44.424417 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 10:59:44 crc kubenswrapper[4695]: I0320 10:59:44.466683 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 10:59:44 crc kubenswrapper[4695]: I0320 10:59:44.473641 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 10:59:44 crc kubenswrapper[4695]: I0320 
10:59:44.501413 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 10:59:44 crc kubenswrapper[4695]: I0320 10:59:44.602170 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 10:59:44 crc kubenswrapper[4695]: I0320 10:59:44.631388 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 10:59:44 crc kubenswrapper[4695]: I0320 10:59:44.633334 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 10:59:44 crc kubenswrapper[4695]: I0320 10:59:44.633383 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 10:59:44 crc kubenswrapper[4695]: I0320 10:59:44.642833 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 10:59:44 crc kubenswrapper[4695]: I0320 10:59:44.665934 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 10:59:44 crc kubenswrapper[4695]: I0320 10:59:44.688666 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 10:59:44 crc kubenswrapper[4695]: I0320 10:59:44.801954 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 10:59:44 crc kubenswrapper[4695]: I0320 10:59:44.875712 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 10:59:44 crc kubenswrapper[4695]: I0320 10:59:44.964368 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 10:59:45 crc 
kubenswrapper[4695]: I0320 10:59:45.019308 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 10:59:45 crc kubenswrapper[4695]: I0320 10:59:45.201181 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 10:59:45 crc kubenswrapper[4695]: I0320 10:59:45.201518 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 10:59:45 crc kubenswrapper[4695]: I0320 10:59:45.256270 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 10:59:45 crc kubenswrapper[4695]: I0320 10:59:45.265193 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 10:59:45 crc kubenswrapper[4695]: I0320 10:59:45.283830 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 10:59:45 crc kubenswrapper[4695]: I0320 10:59:45.389470 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 10:59:45 crc kubenswrapper[4695]: I0320 10:59:45.395421 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 10:59:45 crc kubenswrapper[4695]: I0320 10:59:45.477209 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 10:59:45 crc kubenswrapper[4695]: I0320 10:59:45.616459 4695 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 10:59:45 crc kubenswrapper[4695]: I0320 10:59:45.616824 4695 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://d1b84c2a233b0a09f98d8609c24d637449dc6b4f8f916207e1b4cb5e78c31147" gracePeriod=5 Mar 20 10:59:45 crc kubenswrapper[4695]: I0320 10:59:45.628458 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 10:59:45 crc kubenswrapper[4695]: I0320 10:59:45.666733 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 10:59:45 crc kubenswrapper[4695]: I0320 10:59:45.684629 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 10:59:45 crc kubenswrapper[4695]: I0320 10:59:45.904715 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 10:59:45 crc kubenswrapper[4695]: I0320 10:59:45.916507 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 10:59:45 crc kubenswrapper[4695]: I0320 10:59:45.938829 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 10:59:45 crc kubenswrapper[4695]: I0320 10:59:45.973359 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 10:59:46 crc kubenswrapper[4695]: I0320 10:59:46.013985 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 10:59:46 crc kubenswrapper[4695]: I0320 10:59:46.131658 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 10:59:46 crc kubenswrapper[4695]: I0320 10:59:46.140220 4695 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 10:59:46 crc kubenswrapper[4695]: I0320 10:59:46.294948 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 10:59:46 crc kubenswrapper[4695]: I0320 10:59:46.366738 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 10:59:46 crc kubenswrapper[4695]: I0320 10:59:46.564017 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 10:59:46 crc kubenswrapper[4695]: I0320 10:59:46.617127 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 10:59:46 crc kubenswrapper[4695]: I0320 10:59:46.661217 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 10:59:46 crc kubenswrapper[4695]: I0320 10:59:46.888684 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 10:59:46 crc kubenswrapper[4695]: I0320 10:59:46.954268 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 10:59:46 crc kubenswrapper[4695]: I0320 10:59:46.959491 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 10:59:46 crc kubenswrapper[4695]: I0320 10:59:46.964874 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 10:59:46 crc kubenswrapper[4695]: I0320 10:59:46.977647 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 10:59:47 crc kubenswrapper[4695]: I0320 10:59:47.034468 4695 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 10:59:47 crc kubenswrapper[4695]: I0320 10:59:47.223272 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 10:59:47 crc kubenswrapper[4695]: I0320 10:59:47.305774 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 10:59:47 crc kubenswrapper[4695]: I0320 10:59:47.558348 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 10:59:47 crc kubenswrapper[4695]: I0320 10:59:47.639729 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 10:59:47 crc kubenswrapper[4695]: I0320 10:59:47.764585 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 10:59:47 crc kubenswrapper[4695]: I0320 10:59:47.937474 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 10:59:48 crc kubenswrapper[4695]: I0320 10:59:48.157744 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 10:59:48 crc kubenswrapper[4695]: I0320 10:59:48.327270 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 10:59:48 crc kubenswrapper[4695]: I0320 10:59:48.333854 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 10:59:48 crc kubenswrapper[4695]: I0320 10:59:48.481952 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 10:59:48 crc kubenswrapper[4695]: I0320 10:59:48.495550 
4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 10:59:49 crc kubenswrapper[4695]: I0320 10:59:49.393746 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 10:59:50 crc kubenswrapper[4695]: I0320 10:59:50.751904 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 10:59:51 crc kubenswrapper[4695]: I0320 10:59:51.197637 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 10:59:51 crc kubenswrapper[4695]: I0320 10:59:51.197823 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:51 crc kubenswrapper[4695]: I0320 10:59:51.295221 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 10:59:51 crc kubenswrapper[4695]: I0320 10:59:51.295306 4695 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="d1b84c2a233b0a09f98d8609c24d637449dc6b4f8f916207e1b4cb5e78c31147" exitCode=137 Mar 20 10:59:51 crc kubenswrapper[4695]: I0320 10:59:51.295402 4695 scope.go:117] "RemoveContainer" containerID="d1b84c2a233b0a09f98d8609c24d637449dc6b4f8f916207e1b4cb5e78c31147" Mar 20 10:59:51 crc kubenswrapper[4695]: I0320 10:59:51.295588 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 10:59:51 crc kubenswrapper[4695]: I0320 10:59:51.316549 4695 scope.go:117] "RemoveContainer" containerID="d1b84c2a233b0a09f98d8609c24d637449dc6b4f8f916207e1b4cb5e78c31147" Mar 20 10:59:51 crc kubenswrapper[4695]: E0320 10:59:51.317337 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1b84c2a233b0a09f98d8609c24d637449dc6b4f8f916207e1b4cb5e78c31147\": container with ID starting with d1b84c2a233b0a09f98d8609c24d637449dc6b4f8f916207e1b4cb5e78c31147 not found: ID does not exist" containerID="d1b84c2a233b0a09f98d8609c24d637449dc6b4f8f916207e1b4cb5e78c31147" Mar 20 10:59:51 crc kubenswrapper[4695]: I0320 10:59:51.317414 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1b84c2a233b0a09f98d8609c24d637449dc6b4f8f916207e1b4cb5e78c31147"} err="failed to get container status \"d1b84c2a233b0a09f98d8609c24d637449dc6b4f8f916207e1b4cb5e78c31147\": rpc error: code = NotFound desc = could not find container \"d1b84c2a233b0a09f98d8609c24d637449dc6b4f8f916207e1b4cb5e78c31147\": container with ID starting with d1b84c2a233b0a09f98d8609c24d637449dc6b4f8f916207e1b4cb5e78c31147 not found: ID does not exist" Mar 20 10:59:51 crc kubenswrapper[4695]: I0320 10:59:51.346931 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 10:59:51 crc kubenswrapper[4695]: I0320 10:59:51.347041 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 
10:59:51 crc kubenswrapper[4695]: I0320 10:59:51.347065 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 10:59:51 crc kubenswrapper[4695]: I0320 10:59:51.347182 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 10:59:51 crc kubenswrapper[4695]: I0320 10:59:51.347055 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:59:51 crc kubenswrapper[4695]: I0320 10:59:51.347274 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 10:59:51 crc kubenswrapper[4695]: I0320 10:59:51.347105 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:59:51 crc kubenswrapper[4695]: I0320 10:59:51.347383 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:59:51 crc kubenswrapper[4695]: I0320 10:59:51.347596 4695 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:51 crc kubenswrapper[4695]: I0320 10:59:51.347614 4695 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:51 crc kubenswrapper[4695]: I0320 10:59:51.347627 4695 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:51 crc kubenswrapper[4695]: I0320 10:59:51.347699 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:59:51 crc kubenswrapper[4695]: I0320 10:59:51.356511 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). 
InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:59:51 crc kubenswrapper[4695]: I0320 10:59:51.448723 4695 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:51 crc kubenswrapper[4695]: I0320 10:59:51.448761 4695 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:52 crc kubenswrapper[4695]: I0320 10:59:52.893638 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 20 10:59:52 crc kubenswrapper[4695]: I0320 10:59:52.893968 4695 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 20 10:59:52 crc kubenswrapper[4695]: I0320 10:59:52.906028 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 10:59:52 crc kubenswrapper[4695]: I0320 10:59:52.906083 4695 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="2da66378-fcfe-4aa9-8a6d-e9e716dadfa5" Mar 20 10:59:52 crc kubenswrapper[4695]: I0320 10:59:52.909879 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 10:59:52 crc kubenswrapper[4695]: I0320 10:59:52.909969 4695 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="2da66378-fcfe-4aa9-8a6d-e9e716dadfa5" Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.189299 
4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566740-wbqkd"] Mar 20 11:00:00 crc kubenswrapper[4695]: E0320 11:00:00.189751 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a82b7512-ebb9-42df-aff3-e8a7077b0fcf" containerName="installer" Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.189771 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="a82b7512-ebb9-42df-aff3-e8a7077b0fcf" containerName="installer" Mar 20 11:00:00 crc kubenswrapper[4695]: E0320 11:00:00.189783 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.189791 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.189969 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.189986 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="a82b7512-ebb9-42df-aff3-e8a7077b0fcf" containerName="installer" Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.190753 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566740-wbqkd" Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.195697 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5kqds" Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.196093 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.200552 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.207693 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566740-82jvj"] Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.208877 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-82jvj" Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.215560 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.219630 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566740-82jvj"] Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.224771 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.232933 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566740-wbqkd"] Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.278061 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpb6h\" (UniqueName: 
\"kubernetes.io/projected/940bd5a7-5746-4008-9f5a-84a7c0dc4c15-kube-api-access-xpb6h\") pod \"collect-profiles-29566740-82jvj\" (UID: \"940bd5a7-5746-4008-9f5a-84a7c0dc4c15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-82jvj" Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.278150 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/940bd5a7-5746-4008-9f5a-84a7c0dc4c15-config-volume\") pod \"collect-profiles-29566740-82jvj\" (UID: \"940bd5a7-5746-4008-9f5a-84a7c0dc4c15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-82jvj" Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.278352 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/940bd5a7-5746-4008-9f5a-84a7c0dc4c15-secret-volume\") pod \"collect-profiles-29566740-82jvj\" (UID: \"940bd5a7-5746-4008-9f5a-84a7c0dc4c15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-82jvj" Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.278475 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlfcl\" (UniqueName: \"kubernetes.io/projected/a27fd160-2f93-4820-b5cf-566db0042594-kube-api-access-rlfcl\") pod \"auto-csr-approver-29566740-wbqkd\" (UID: \"a27fd160-2f93-4820-b5cf-566db0042594\") " pod="openshift-infra/auto-csr-approver-29566740-wbqkd" Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.379722 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpb6h\" (UniqueName: \"kubernetes.io/projected/940bd5a7-5746-4008-9f5a-84a7c0dc4c15-kube-api-access-xpb6h\") pod \"collect-profiles-29566740-82jvj\" (UID: \"940bd5a7-5746-4008-9f5a-84a7c0dc4c15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-82jvj" 
Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.379845 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/940bd5a7-5746-4008-9f5a-84a7c0dc4c15-config-volume\") pod \"collect-profiles-29566740-82jvj\" (UID: \"940bd5a7-5746-4008-9f5a-84a7c0dc4c15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-82jvj" Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.379969 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/940bd5a7-5746-4008-9f5a-84a7c0dc4c15-secret-volume\") pod \"collect-profiles-29566740-82jvj\" (UID: \"940bd5a7-5746-4008-9f5a-84a7c0dc4c15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-82jvj" Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.380011 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlfcl\" (UniqueName: \"kubernetes.io/projected/a27fd160-2f93-4820-b5cf-566db0042594-kube-api-access-rlfcl\") pod \"auto-csr-approver-29566740-wbqkd\" (UID: \"a27fd160-2f93-4820-b5cf-566db0042594\") " pod="openshift-infra/auto-csr-approver-29566740-wbqkd" Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.381061 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/940bd5a7-5746-4008-9f5a-84a7c0dc4c15-config-volume\") pod \"collect-profiles-29566740-82jvj\" (UID: \"940bd5a7-5746-4008-9f5a-84a7c0dc4c15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-82jvj" Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.387561 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/940bd5a7-5746-4008-9f5a-84a7c0dc4c15-secret-volume\") pod \"collect-profiles-29566740-82jvj\" (UID: \"940bd5a7-5746-4008-9f5a-84a7c0dc4c15\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-82jvj" Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.397728 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlfcl\" (UniqueName: \"kubernetes.io/projected/a27fd160-2f93-4820-b5cf-566db0042594-kube-api-access-rlfcl\") pod \"auto-csr-approver-29566740-wbqkd\" (UID: \"a27fd160-2f93-4820-b5cf-566db0042594\") " pod="openshift-infra/auto-csr-approver-29566740-wbqkd" Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.402141 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpb6h\" (UniqueName: \"kubernetes.io/projected/940bd5a7-5746-4008-9f5a-84a7c0dc4c15-kube-api-access-xpb6h\") pod \"collect-profiles-29566740-82jvj\" (UID: \"940bd5a7-5746-4008-9f5a-84a7c0dc4c15\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-82jvj" Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.512241 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566740-wbqkd" Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.533061 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-82jvj" Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.822409 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566740-82jvj"] Mar 20 11:00:00 crc kubenswrapper[4695]: I0320 11:00:00.960900 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566740-wbqkd"] Mar 20 11:00:01 crc kubenswrapper[4695]: I0320 11:00:01.366723 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566740-wbqkd" event={"ID":"a27fd160-2f93-4820-b5cf-566db0042594","Type":"ContainerStarted","Data":"98792eb225e17627b05e2eeb79ddb30883aa0e40cb0013bbffdfd1ef89b60ec8"} Mar 20 11:00:01 crc kubenswrapper[4695]: I0320 11:00:01.369449 4695 generic.go:334] "Generic (PLEG): container finished" podID="940bd5a7-5746-4008-9f5a-84a7c0dc4c15" containerID="d4befc3dbecc6e35a03b7414591ca6efd36fab355e376ecdfd0caaaba1353d85" exitCode=0 Mar 20 11:00:01 crc kubenswrapper[4695]: I0320 11:00:01.369518 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-82jvj" event={"ID":"940bd5a7-5746-4008-9f5a-84a7c0dc4c15","Type":"ContainerDied","Data":"d4befc3dbecc6e35a03b7414591ca6efd36fab355e376ecdfd0caaaba1353d85"} Mar 20 11:00:01 crc kubenswrapper[4695]: I0320 11:00:01.369562 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-82jvj" event={"ID":"940bd5a7-5746-4008-9f5a-84a7c0dc4c15","Type":"ContainerStarted","Data":"f393c04e0d200712d1e28b70a5ab38446bad6d9b07195d933948eb2316131105"} Mar 20 11:00:02 crc kubenswrapper[4695]: I0320 11:00:02.635601 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-82jvj" Mar 20 11:00:02 crc kubenswrapper[4695]: I0320 11:00:02.747423 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/940bd5a7-5746-4008-9f5a-84a7c0dc4c15-secret-volume\") pod \"940bd5a7-5746-4008-9f5a-84a7c0dc4c15\" (UID: \"940bd5a7-5746-4008-9f5a-84a7c0dc4c15\") " Mar 20 11:00:02 crc kubenswrapper[4695]: I0320 11:00:02.747579 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/940bd5a7-5746-4008-9f5a-84a7c0dc4c15-config-volume\") pod \"940bd5a7-5746-4008-9f5a-84a7c0dc4c15\" (UID: \"940bd5a7-5746-4008-9f5a-84a7c0dc4c15\") " Mar 20 11:00:02 crc kubenswrapper[4695]: I0320 11:00:02.747690 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpb6h\" (UniqueName: \"kubernetes.io/projected/940bd5a7-5746-4008-9f5a-84a7c0dc4c15-kube-api-access-xpb6h\") pod \"940bd5a7-5746-4008-9f5a-84a7c0dc4c15\" (UID: \"940bd5a7-5746-4008-9f5a-84a7c0dc4c15\") " Mar 20 11:00:02 crc kubenswrapper[4695]: I0320 11:00:02.748822 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/940bd5a7-5746-4008-9f5a-84a7c0dc4c15-config-volume" (OuterVolumeSpecName: "config-volume") pod "940bd5a7-5746-4008-9f5a-84a7c0dc4c15" (UID: "940bd5a7-5746-4008-9f5a-84a7c0dc4c15"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:00:02 crc kubenswrapper[4695]: I0320 11:00:02.754039 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/940bd5a7-5746-4008-9f5a-84a7c0dc4c15-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "940bd5a7-5746-4008-9f5a-84a7c0dc4c15" (UID: "940bd5a7-5746-4008-9f5a-84a7c0dc4c15"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 11:00:02 crc kubenswrapper[4695]: I0320 11:00:02.754171 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/940bd5a7-5746-4008-9f5a-84a7c0dc4c15-kube-api-access-xpb6h" (OuterVolumeSpecName: "kube-api-access-xpb6h") pod "940bd5a7-5746-4008-9f5a-84a7c0dc4c15" (UID: "940bd5a7-5746-4008-9f5a-84a7c0dc4c15"). InnerVolumeSpecName "kube-api-access-xpb6h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:00:02 crc kubenswrapper[4695]: I0320 11:00:02.849631 4695 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/940bd5a7-5746-4008-9f5a-84a7c0dc4c15-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 11:00:02 crc kubenswrapper[4695]: I0320 11:00:02.849669 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpb6h\" (UniqueName: \"kubernetes.io/projected/940bd5a7-5746-4008-9f5a-84a7c0dc4c15-kube-api-access-xpb6h\") on node \"crc\" DevicePath \"\""
Mar 20 11:00:02 crc kubenswrapper[4695]: I0320 11:00:02.849686 4695 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/940bd5a7-5746-4008-9f5a-84a7c0dc4c15-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 20 11:00:03 crc kubenswrapper[4695]: I0320 11:00:03.385474 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-82jvj" event={"ID":"940bd5a7-5746-4008-9f5a-84a7c0dc4c15","Type":"ContainerDied","Data":"f393c04e0d200712d1e28b70a5ab38446bad6d9b07195d933948eb2316131105"}
Mar 20 11:00:03 crc kubenswrapper[4695]: I0320 11:00:03.385542 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f393c04e0d200712d1e28b70a5ab38446bad6d9b07195d933948eb2316131105"
Mar 20 11:00:03 crc kubenswrapper[4695]: I0320 11:00:03.385559 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-82jvj"
Mar 20 11:00:04 crc kubenswrapper[4695]: I0320 11:00:04.485122 4695 scope.go:117] "RemoveContainer" containerID="4293190000cd989e69d734267dafd4e78201fb6162849deba3c88ff9846e0832"
Mar 20 11:00:04 crc kubenswrapper[4695]: I0320 11:00:04.515199 4695 scope.go:117] "RemoveContainer" containerID="a702a75901ce7b42462eb3e811286522935e0f3d75e4ccb8749a0a4e49594519"
Mar 20 11:00:04 crc kubenswrapper[4695]: I0320 11:00:04.562430 4695 scope.go:117] "RemoveContainer" containerID="01484b70548936d6b63401a723acc7388a150160cbbcb3343ee5288c72d72276"
Mar 20 11:00:04 crc kubenswrapper[4695]: I0320 11:00:04.589794 4695 scope.go:117] "RemoveContainer" containerID="b08d1d2f73aec8f5d2f5aa0cd3977c0451a1878e8aa38f985c1c891e12777e4c"
Mar 20 11:00:04 crc kubenswrapper[4695]: I0320 11:00:04.608217 4695 scope.go:117] "RemoveContainer" containerID="432851621b6f9a1072e08a0d003deaeb97a60f8e72c7a48604af05eabb08b58d"
Mar 20 11:00:06 crc kubenswrapper[4695]: I0320 11:00:06.410853 4695 generic.go:334] "Generic (PLEG): container finished" podID="a27fd160-2f93-4820-b5cf-566db0042594" containerID="6945876e53842900f6e1cdedca0308f6f7034a64d87aab98548ef4d7d5c67e89" exitCode=0
Mar 20 11:00:06 crc kubenswrapper[4695]: I0320 11:00:06.410928 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566740-wbqkd" event={"ID":"a27fd160-2f93-4820-b5cf-566db0042594","Type":"ContainerDied","Data":"6945876e53842900f6e1cdedca0308f6f7034a64d87aab98548ef4d7d5c67e89"}
Mar 20 11:00:07 crc kubenswrapper[4695]: I0320 11:00:07.693095 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566740-wbqkd"
Mar 20 11:00:07 crc kubenswrapper[4695]: I0320 11:00:07.836168 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlfcl\" (UniqueName: \"kubernetes.io/projected/a27fd160-2f93-4820-b5cf-566db0042594-kube-api-access-rlfcl\") pod \"a27fd160-2f93-4820-b5cf-566db0042594\" (UID: \"a27fd160-2f93-4820-b5cf-566db0042594\") "
Mar 20 11:00:07 crc kubenswrapper[4695]: I0320 11:00:07.844247 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a27fd160-2f93-4820-b5cf-566db0042594-kube-api-access-rlfcl" (OuterVolumeSpecName: "kube-api-access-rlfcl") pod "a27fd160-2f93-4820-b5cf-566db0042594" (UID: "a27fd160-2f93-4820-b5cf-566db0042594"). InnerVolumeSpecName "kube-api-access-rlfcl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:00:07 crc kubenswrapper[4695]: I0320 11:00:07.939588 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlfcl\" (UniqueName: \"kubernetes.io/projected/a27fd160-2f93-4820-b5cf-566db0042594-kube-api-access-rlfcl\") on node \"crc\" DevicePath \"\""
Mar 20 11:00:08 crc kubenswrapper[4695]: I0320 11:00:08.427411 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566740-wbqkd" event={"ID":"a27fd160-2f93-4820-b5cf-566db0042594","Type":"ContainerDied","Data":"98792eb225e17627b05e2eeb79ddb30883aa0e40cb0013bbffdfd1ef89b60ec8"}
Mar 20 11:00:08 crc kubenswrapper[4695]: I0320 11:00:08.427471 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98792eb225e17627b05e2eeb79ddb30883aa0e40cb0013bbffdfd1ef89b60ec8"
Mar 20 11:00:08 crc kubenswrapper[4695]: I0320 11:00:08.427895 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566740-wbqkd"
Mar 20 11:00:28 crc kubenswrapper[4695]: I0320 11:00:28.144935 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6fqn2"]
Mar 20 11:00:28 crc kubenswrapper[4695]: I0320 11:00:28.146330 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6fqn2" podUID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" containerName="registry-server" containerID="cri-o://39fa253f4fb4d8b339cbed6aefb5697ca3d071430c6b22982434c71141011611" gracePeriod=2
Mar 20 11:00:28 crc kubenswrapper[4695]: I0320 11:00:28.340936 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-shq4g"]
Mar 20 11:00:28 crc kubenswrapper[4695]: I0320 11:00:28.341380 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-shq4g" podUID="eb233657-545c-4a0b-93a0-b29148b5cb3f" containerName="registry-server" containerID="cri-o://861dfc4058068d4c4f57d676e81bd038ec64ab10c904d822de8da01044c941df" gracePeriod=2
Mar 20 11:00:28 crc kubenswrapper[4695]: I0320 11:00:28.963426 4695 generic.go:334] "Generic (PLEG): container finished" podID="eb233657-545c-4a0b-93a0-b29148b5cb3f" containerID="861dfc4058068d4c4f57d676e81bd038ec64ab10c904d822de8da01044c941df" exitCode=0
Mar 20 11:00:28 crc kubenswrapper[4695]: I0320 11:00:28.963602 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shq4g" event={"ID":"eb233657-545c-4a0b-93a0-b29148b5cb3f","Type":"ContainerDied","Data":"861dfc4058068d4c4f57d676e81bd038ec64ab10c904d822de8da01044c941df"}
Mar 20 11:00:28 crc kubenswrapper[4695]: I0320 11:00:28.971442 4695 generic.go:334] "Generic (PLEG): container finished" podID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" containerID="39fa253f4fb4d8b339cbed6aefb5697ca3d071430c6b22982434c71141011611" exitCode=0
Mar 20 11:00:28 crc kubenswrapper[4695]: I0320 11:00:28.971503 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6fqn2" event={"ID":"33f393cc-11cf-4c7a-aeac-8423998e5dc6","Type":"ContainerDied","Data":"39fa253f4fb4d8b339cbed6aefb5697ca3d071430c6b22982434c71141011611"}
Mar 20 11:00:29 crc kubenswrapper[4695]: E0320 11:00:29.129786 4695 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 39fa253f4fb4d8b339cbed6aefb5697ca3d071430c6b22982434c71141011611 is running failed: container process not found" containerID="39fa253f4fb4d8b339cbed6aefb5697ca3d071430c6b22982434c71141011611" cmd=["grpc_health_probe","-addr=:50051"]
Mar 20 11:00:29 crc kubenswrapper[4695]: E0320 11:00:29.131534 4695 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 39fa253f4fb4d8b339cbed6aefb5697ca3d071430c6b22982434c71141011611 is running failed: container process not found" containerID="39fa253f4fb4d8b339cbed6aefb5697ca3d071430c6b22982434c71141011611" cmd=["grpc_health_probe","-addr=:50051"]
Mar 20 11:00:29 crc kubenswrapper[4695]: E0320 11:00:29.132270 4695 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 39fa253f4fb4d8b339cbed6aefb5697ca3d071430c6b22982434c71141011611 is running failed: container process not found" containerID="39fa253f4fb4d8b339cbed6aefb5697ca3d071430c6b22982434c71141011611" cmd=["grpc_health_probe","-addr=:50051"]
Mar 20 11:00:29 crc kubenswrapper[4695]: E0320 11:00:29.132346 4695 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 39fa253f4fb4d8b339cbed6aefb5697ca3d071430c6b22982434c71141011611 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-6fqn2" podUID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" containerName="registry-server"
Mar 20 11:00:29 crc kubenswrapper[4695]: I0320 11:00:29.171799 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6fqn2"
Mar 20 11:00:29 crc kubenswrapper[4695]: I0320 11:00:29.297214 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-shq4g"
Mar 20 11:00:29 crc kubenswrapper[4695]: I0320 11:00:29.341484 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33f393cc-11cf-4c7a-aeac-8423998e5dc6-utilities\") pod \"33f393cc-11cf-4c7a-aeac-8423998e5dc6\" (UID: \"33f393cc-11cf-4c7a-aeac-8423998e5dc6\") "
Mar 20 11:00:29 crc kubenswrapper[4695]: I0320 11:00:29.341657 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33f393cc-11cf-4c7a-aeac-8423998e5dc6-catalog-content\") pod \"33f393cc-11cf-4c7a-aeac-8423998e5dc6\" (UID: \"33f393cc-11cf-4c7a-aeac-8423998e5dc6\") "
Mar 20 11:00:29 crc kubenswrapper[4695]: I0320 11:00:29.341760 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lppf4\" (UniqueName: \"kubernetes.io/projected/33f393cc-11cf-4c7a-aeac-8423998e5dc6-kube-api-access-lppf4\") pod \"33f393cc-11cf-4c7a-aeac-8423998e5dc6\" (UID: \"33f393cc-11cf-4c7a-aeac-8423998e5dc6\") "
Mar 20 11:00:29 crc kubenswrapper[4695]: I0320 11:00:29.342700 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33f393cc-11cf-4c7a-aeac-8423998e5dc6-utilities" (OuterVolumeSpecName: "utilities") pod "33f393cc-11cf-4c7a-aeac-8423998e5dc6" (UID: "33f393cc-11cf-4c7a-aeac-8423998e5dc6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:00:29 crc kubenswrapper[4695]: I0320 11:00:29.348584 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33f393cc-11cf-4c7a-aeac-8423998e5dc6-kube-api-access-lppf4" (OuterVolumeSpecName: "kube-api-access-lppf4") pod "33f393cc-11cf-4c7a-aeac-8423998e5dc6" (UID: "33f393cc-11cf-4c7a-aeac-8423998e5dc6"). InnerVolumeSpecName "kube-api-access-lppf4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:00:29 crc kubenswrapper[4695]: I0320 11:00:29.403279 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/33f393cc-11cf-4c7a-aeac-8423998e5dc6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "33f393cc-11cf-4c7a-aeac-8423998e5dc6" (UID: "33f393cc-11cf-4c7a-aeac-8423998e5dc6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:00:29 crc kubenswrapper[4695]: I0320 11:00:29.442830 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5c4mc\" (UniqueName: \"kubernetes.io/projected/eb233657-545c-4a0b-93a0-b29148b5cb3f-kube-api-access-5c4mc\") pod \"eb233657-545c-4a0b-93a0-b29148b5cb3f\" (UID: \"eb233657-545c-4a0b-93a0-b29148b5cb3f\") "
Mar 20 11:00:29 crc kubenswrapper[4695]: I0320 11:00:29.442947 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb233657-545c-4a0b-93a0-b29148b5cb3f-catalog-content\") pod \"eb233657-545c-4a0b-93a0-b29148b5cb3f\" (UID: \"eb233657-545c-4a0b-93a0-b29148b5cb3f\") "
Mar 20 11:00:29 crc kubenswrapper[4695]: I0320 11:00:29.443021 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb233657-545c-4a0b-93a0-b29148b5cb3f-utilities\") pod \"eb233657-545c-4a0b-93a0-b29148b5cb3f\" (UID: \"eb233657-545c-4a0b-93a0-b29148b5cb3f\") "
Mar 20 11:00:29 crc kubenswrapper[4695]: I0320 11:00:29.443398 4695 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33f393cc-11cf-4c7a-aeac-8423998e5dc6-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 11:00:29 crc kubenswrapper[4695]: I0320 11:00:29.443419 4695 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33f393cc-11cf-4c7a-aeac-8423998e5dc6-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 11:00:29 crc kubenswrapper[4695]: I0320 11:00:29.443438 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lppf4\" (UniqueName: \"kubernetes.io/projected/33f393cc-11cf-4c7a-aeac-8423998e5dc6-kube-api-access-lppf4\") on node \"crc\" DevicePath \"\""
Mar 20 11:00:29 crc kubenswrapper[4695]: I0320 11:00:29.444482 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb233657-545c-4a0b-93a0-b29148b5cb3f-utilities" (OuterVolumeSpecName: "utilities") pod "eb233657-545c-4a0b-93a0-b29148b5cb3f" (UID: "eb233657-545c-4a0b-93a0-b29148b5cb3f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:00:29 crc kubenswrapper[4695]: I0320 11:00:29.450210 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb233657-545c-4a0b-93a0-b29148b5cb3f-kube-api-access-5c4mc" (OuterVolumeSpecName: "kube-api-access-5c4mc") pod "eb233657-545c-4a0b-93a0-b29148b5cb3f" (UID: "eb233657-545c-4a0b-93a0-b29148b5cb3f"). InnerVolumeSpecName "kube-api-access-5c4mc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:00:29 crc kubenswrapper[4695]: I0320 11:00:29.497665 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb233657-545c-4a0b-93a0-b29148b5cb3f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb233657-545c-4a0b-93a0-b29148b5cb3f" (UID: "eb233657-545c-4a0b-93a0-b29148b5cb3f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:00:29 crc kubenswrapper[4695]: I0320 11:00:29.545557 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5c4mc\" (UniqueName: \"kubernetes.io/projected/eb233657-545c-4a0b-93a0-b29148b5cb3f-kube-api-access-5c4mc\") on node \"crc\" DevicePath \"\""
Mar 20 11:00:29 crc kubenswrapper[4695]: I0320 11:00:29.545605 4695 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb233657-545c-4a0b-93a0-b29148b5cb3f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 11:00:29 crc kubenswrapper[4695]: I0320 11:00:29.545637 4695 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb233657-545c-4a0b-93a0-b29148b5cb3f-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 11:00:29 crc kubenswrapper[4695]: I0320 11:00:29.980231 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-shq4g"
Mar 20 11:00:29 crc kubenswrapper[4695]: I0320 11:00:29.980233 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shq4g" event={"ID":"eb233657-545c-4a0b-93a0-b29148b5cb3f","Type":"ContainerDied","Data":"4083af247e308dd9d23e806a531aafdedae5a587b6446114c4f3b8d0f80a0831"}
Mar 20 11:00:29 crc kubenswrapper[4695]: I0320 11:00:29.980402 4695 scope.go:117] "RemoveContainer" containerID="861dfc4058068d4c4f57d676e81bd038ec64ab10c904d822de8da01044c941df"
Mar 20 11:00:29 crc kubenswrapper[4695]: I0320 11:00:29.983087 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6fqn2" event={"ID":"33f393cc-11cf-4c7a-aeac-8423998e5dc6","Type":"ContainerDied","Data":"10111797dfa1f248c9c68d890d951f4866214f29b6bce291b2c0deff39fbaba8"}
Mar 20 11:00:29 crc kubenswrapper[4695]: I0320 11:00:29.983173 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6fqn2"
Mar 20 11:00:29 crc kubenswrapper[4695]: I0320 11:00:29.998461 4695 scope.go:117] "RemoveContainer" containerID="4fb592d7ee5f03ec779af1f27c3247e880a518de5546ad8b20d95bc5c009e608"
Mar 20 11:00:30 crc kubenswrapper[4695]: I0320 11:00:30.017583 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6fqn2"]
Mar 20 11:00:30 crc kubenswrapper[4695]: I0320 11:00:30.025666 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6fqn2"]
Mar 20 11:00:30 crc kubenswrapper[4695]: I0320 11:00:30.030613 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-shq4g"]
Mar 20 11:00:30 crc kubenswrapper[4695]: I0320 11:00:30.035228 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-shq4g"]
Mar 20 11:00:30 crc kubenswrapper[4695]: I0320 11:00:30.037869 4695 scope.go:117] "RemoveContainer" containerID="9c4785c8b59b6752da47b7ed3c9a6793c11f1117df037f729145ea5097a1b990"
Mar 20 11:00:30 crc kubenswrapper[4695]: I0320 11:00:30.052536 4695 scope.go:117] "RemoveContainer" containerID="39fa253f4fb4d8b339cbed6aefb5697ca3d071430c6b22982434c71141011611"
Mar 20 11:00:30 crc kubenswrapper[4695]: I0320 11:00:30.074151 4695 scope.go:117] "RemoveContainer" containerID="3aabcce0ae04970822f581437e36f97e5858515e319abee4128f5097a8fcc374"
Mar 20 11:00:30 crc kubenswrapper[4695]: I0320 11:00:30.101749 4695 scope.go:117] "RemoveContainer" containerID="8488c576337e96c6eb84aa8b7ad6744195af2acdba5c5fc36f1f8ec1a83c658b"
Mar 20 11:00:30 crc kubenswrapper[4695]: I0320 11:00:30.540565 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6mw7h"]
Mar 20 11:00:30 crc kubenswrapper[4695]: I0320 11:00:30.541290 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6mw7h" podUID="d2ea4f1f-16e3-4ac7-ac16-f782b94669ff" containerName="registry-server" containerID="cri-o://7e6a5ba88866cb4447a1dc39d8d44d8ccfed81d6432188d2eb5a558ea2b9da29" gracePeriod=2
Mar 20 11:00:30 crc kubenswrapper[4695]: I0320 11:00:30.740185 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jnfk6"]
Mar 20 11:00:30 crc kubenswrapper[4695]: I0320 11:00:30.740530 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jnfk6" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" containerName="registry-server" containerID="cri-o://5a830ceb7bbdd95e6b3b71760c04f7298528598d10fe78bb2af5d0010f83de08" gracePeriod=2
Mar 20 11:00:30 crc kubenswrapper[4695]: I0320 11:00:30.896482 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" path="/var/lib/kubelet/pods/33f393cc-11cf-4c7a-aeac-8423998e5dc6/volumes"
Mar 20 11:00:30 crc kubenswrapper[4695]: I0320 11:00:30.897448 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb233657-545c-4a0b-93a0-b29148b5cb3f" path="/var/lib/kubelet/pods/eb233657-545c-4a0b-93a0-b29148b5cb3f/volumes"
Mar 20 11:00:31 crc kubenswrapper[4695]: E0320 11:00:31.321251 4695 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7e6a5ba88866cb4447a1dc39d8d44d8ccfed81d6432188d2eb5a558ea2b9da29 is running failed: container process not found" containerID="7e6a5ba88866cb4447a1dc39d8d44d8ccfed81d6432188d2eb5a558ea2b9da29" cmd=["grpc_health_probe","-addr=:50051"]
Mar 20 11:00:31 crc kubenswrapper[4695]: E0320 11:00:31.321654 4695 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7e6a5ba88866cb4447a1dc39d8d44d8ccfed81d6432188d2eb5a558ea2b9da29 is running failed: container process not found" containerID="7e6a5ba88866cb4447a1dc39d8d44d8ccfed81d6432188d2eb5a558ea2b9da29" cmd=["grpc_health_probe","-addr=:50051"]
Mar 20 11:00:31 crc kubenswrapper[4695]: E0320 11:00:31.322147 4695 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7e6a5ba88866cb4447a1dc39d8d44d8ccfed81d6432188d2eb5a558ea2b9da29 is running failed: container process not found" containerID="7e6a5ba88866cb4447a1dc39d8d44d8ccfed81d6432188d2eb5a558ea2b9da29" cmd=["grpc_health_probe","-addr=:50051"]
Mar 20 11:00:31 crc kubenswrapper[4695]: E0320 11:00:31.322239 4695 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7e6a5ba88866cb4447a1dc39d8d44d8ccfed81d6432188d2eb5a558ea2b9da29 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-6mw7h" podUID="d2ea4f1f-16e3-4ac7-ac16-f782b94669ff" containerName="registry-server"
Mar 20 11:00:31 crc kubenswrapper[4695]: I0320 11:00:31.457410 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6mw7h"
Mar 20 11:00:31 crc kubenswrapper[4695]: I0320 11:00:31.575950 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2ea4f1f-16e3-4ac7-ac16-f782b94669ff-catalog-content\") pod \"d2ea4f1f-16e3-4ac7-ac16-f782b94669ff\" (UID: \"d2ea4f1f-16e3-4ac7-ac16-f782b94669ff\") "
Mar 20 11:00:31 crc kubenswrapper[4695]: I0320 11:00:31.576040 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5h4f\" (UniqueName: \"kubernetes.io/projected/d2ea4f1f-16e3-4ac7-ac16-f782b94669ff-kube-api-access-w5h4f\") pod \"d2ea4f1f-16e3-4ac7-ac16-f782b94669ff\" (UID: \"d2ea4f1f-16e3-4ac7-ac16-f782b94669ff\") "
Mar 20 11:00:31 crc kubenswrapper[4695]: I0320 11:00:31.576079 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2ea4f1f-16e3-4ac7-ac16-f782b94669ff-utilities\") pod \"d2ea4f1f-16e3-4ac7-ac16-f782b94669ff\" (UID: \"d2ea4f1f-16e3-4ac7-ac16-f782b94669ff\") "
Mar 20 11:00:31 crc kubenswrapper[4695]: I0320 11:00:31.579392 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2ea4f1f-16e3-4ac7-ac16-f782b94669ff-utilities" (OuterVolumeSpecName: "utilities") pod "d2ea4f1f-16e3-4ac7-ac16-f782b94669ff" (UID: "d2ea4f1f-16e3-4ac7-ac16-f782b94669ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:00:31 crc kubenswrapper[4695]: I0320 11:00:31.587931 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2ea4f1f-16e3-4ac7-ac16-f782b94669ff-kube-api-access-w5h4f" (OuterVolumeSpecName: "kube-api-access-w5h4f") pod "d2ea4f1f-16e3-4ac7-ac16-f782b94669ff" (UID: "d2ea4f1f-16e3-4ac7-ac16-f782b94669ff"). InnerVolumeSpecName "kube-api-access-w5h4f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:00:31 crc kubenswrapper[4695]: I0320 11:00:31.626666 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jnfk6"
Mar 20 11:00:31 crc kubenswrapper[4695]: I0320 11:00:31.655658 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2ea4f1f-16e3-4ac7-ac16-f782b94669ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2ea4f1f-16e3-4ac7-ac16-f782b94669ff" (UID: "d2ea4f1f-16e3-4ac7-ac16-f782b94669ff"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:00:31 crc kubenswrapper[4695]: I0320 11:00:31.677854 4695 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2ea4f1f-16e3-4ac7-ac16-f782b94669ff-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 11:00:31 crc kubenswrapper[4695]: I0320 11:00:31.677917 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5h4f\" (UniqueName: \"kubernetes.io/projected/d2ea4f1f-16e3-4ac7-ac16-f782b94669ff-kube-api-access-w5h4f\") on node \"crc\" DevicePath \"\""
Mar 20 11:00:31 crc kubenswrapper[4695]: I0320 11:00:31.677936 4695 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2ea4f1f-16e3-4ac7-ac16-f782b94669ff-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 11:00:31 crc kubenswrapper[4695]: I0320 11:00:31.779308 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a6824e3-65ec-404c-ac28-59fce8d50d83-utilities\") pod \"2a6824e3-65ec-404c-ac28-59fce8d50d83\" (UID: \"2a6824e3-65ec-404c-ac28-59fce8d50d83\") "
Mar 20 11:00:31 crc kubenswrapper[4695]: I0320 11:00:31.779474 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a6824e3-65ec-404c-ac28-59fce8d50d83-catalog-content\") pod \"2a6824e3-65ec-404c-ac28-59fce8d50d83\" (UID: \"2a6824e3-65ec-404c-ac28-59fce8d50d83\") "
Mar 20 11:00:31 crc kubenswrapper[4695]: I0320 11:00:31.779531 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw9nb\" (UniqueName: \"kubernetes.io/projected/2a6824e3-65ec-404c-ac28-59fce8d50d83-kube-api-access-hw9nb\") pod \"2a6824e3-65ec-404c-ac28-59fce8d50d83\" (UID: \"2a6824e3-65ec-404c-ac28-59fce8d50d83\") "
Mar 20 11:00:31 crc kubenswrapper[4695]: I0320 11:00:31.780703 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a6824e3-65ec-404c-ac28-59fce8d50d83-utilities" (OuterVolumeSpecName: "utilities") pod "2a6824e3-65ec-404c-ac28-59fce8d50d83" (UID: "2a6824e3-65ec-404c-ac28-59fce8d50d83"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:00:31 crc kubenswrapper[4695]: I0320 11:00:31.783356 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a6824e3-65ec-404c-ac28-59fce8d50d83-kube-api-access-hw9nb" (OuterVolumeSpecName: "kube-api-access-hw9nb") pod "2a6824e3-65ec-404c-ac28-59fce8d50d83" (UID: "2a6824e3-65ec-404c-ac28-59fce8d50d83"). InnerVolumeSpecName "kube-api-access-hw9nb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:00:31 crc kubenswrapper[4695]: I0320 11:00:31.881582 4695 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2a6824e3-65ec-404c-ac28-59fce8d50d83-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 11:00:31 crc kubenswrapper[4695]: I0320 11:00:31.881631 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw9nb\" (UniqueName: \"kubernetes.io/projected/2a6824e3-65ec-404c-ac28-59fce8d50d83-kube-api-access-hw9nb\") on node \"crc\" DevicePath \"\""
Mar 20 11:00:31 crc kubenswrapper[4695]: I0320 11:00:31.906712 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a6824e3-65ec-404c-ac28-59fce8d50d83-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2a6824e3-65ec-404c-ac28-59fce8d50d83" (UID: "2a6824e3-65ec-404c-ac28-59fce8d50d83"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:00:31 crc kubenswrapper[4695]: I0320 11:00:31.983514 4695 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2a6824e3-65ec-404c-ac28-59fce8d50d83-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.004260 4695 generic.go:334] "Generic (PLEG): container finished" podID="2a6824e3-65ec-404c-ac28-59fce8d50d83" containerID="5a830ceb7bbdd95e6b3b71760c04f7298528598d10fe78bb2af5d0010f83de08" exitCode=0
Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.004353 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jnfk6" event={"ID":"2a6824e3-65ec-404c-ac28-59fce8d50d83","Type":"ContainerDied","Data":"5a830ceb7bbdd95e6b3b71760c04f7298528598d10fe78bb2af5d0010f83de08"}
Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.004417 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jnfk6" event={"ID":"2a6824e3-65ec-404c-ac28-59fce8d50d83","Type":"ContainerDied","Data":"1e9cbb31bbd2e8b0cc51af61268dffd96783521b37323cee50ed83dad62895dd"}
Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.004448 4695 scope.go:117] "RemoveContainer" containerID="5a830ceb7bbdd95e6b3b71760c04f7298528598d10fe78bb2af5d0010f83de08"
Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.004625 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jnfk6"
Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.006863 4695 generic.go:334] "Generic (PLEG): container finished" podID="d2ea4f1f-16e3-4ac7-ac16-f782b94669ff" containerID="7e6a5ba88866cb4447a1dc39d8d44d8ccfed81d6432188d2eb5a558ea2b9da29" exitCode=0
Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.006920 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6mw7h" event={"ID":"d2ea4f1f-16e3-4ac7-ac16-f782b94669ff","Type":"ContainerDied","Data":"7e6a5ba88866cb4447a1dc39d8d44d8ccfed81d6432188d2eb5a558ea2b9da29"}
Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.006941 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6mw7h" event={"ID":"d2ea4f1f-16e3-4ac7-ac16-f782b94669ff","Type":"ContainerDied","Data":"91e103e42a0e829f6be656cfc924e846a4d7d3968b5da5158b6de0d329fa72ba"}
Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.007016 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6mw7h"
Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.028570 4695 scope.go:117] "RemoveContainer" containerID="993b9e6920765275d4bbed6b9be2801725022b179539b3816889fde842587fb7"
Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.049296 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6mw7h"]
Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.049746 4695 scope.go:117] "RemoveContainer" containerID="773b064b92375274fd0c30d78da235737658fd1744d6b198ed58e3640851551f"
Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.054644 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6mw7h"]
Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.066299 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jnfk6"]
Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.070150 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jnfk6"]
Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.071519 4695 scope.go:117] "RemoveContainer" containerID="5a830ceb7bbdd95e6b3b71760c04f7298528598d10fe78bb2af5d0010f83de08"
Mar 20 11:00:32 crc kubenswrapper[4695]: E0320 11:00:32.072045 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a830ceb7bbdd95e6b3b71760c04f7298528598d10fe78bb2af5d0010f83de08\": container with ID starting with 5a830ceb7bbdd95e6b3b71760c04f7298528598d10fe78bb2af5d0010f83de08 not found: ID does not exist" containerID="5a830ceb7bbdd95e6b3b71760c04f7298528598d10fe78bb2af5d0010f83de08"
Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.072081 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a830ceb7bbdd95e6b3b71760c04f7298528598d10fe78bb2af5d0010f83de08"} err="failed to get container status \"5a830ceb7bbdd95e6b3b71760c04f7298528598d10fe78bb2af5d0010f83de08\": rpc error: code = NotFound desc = could not find container \"5a830ceb7bbdd95e6b3b71760c04f7298528598d10fe78bb2af5d0010f83de08\": container with ID starting with 5a830ceb7bbdd95e6b3b71760c04f7298528598d10fe78bb2af5d0010f83de08 not found: ID does not exist"
Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.072108 4695 scope.go:117] "RemoveContainer" containerID="993b9e6920765275d4bbed6b9be2801725022b179539b3816889fde842587fb7"
Mar 20 11:00:32 crc kubenswrapper[4695]: E0320 11:00:32.072286 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"993b9e6920765275d4bbed6b9be2801725022b179539b3816889fde842587fb7\": container with ID starting with 993b9e6920765275d4bbed6b9be2801725022b179539b3816889fde842587fb7 not found: ID does not exist" containerID="993b9e6920765275d4bbed6b9be2801725022b179539b3816889fde842587fb7"
Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.072314 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"993b9e6920765275d4bbed6b9be2801725022b179539b3816889fde842587fb7"} err="failed to get container status \"993b9e6920765275d4bbed6b9be2801725022b179539b3816889fde842587fb7\": rpc error: code = NotFound desc = could not find container \"993b9e6920765275d4bbed6b9be2801725022b179539b3816889fde842587fb7\": container with ID starting with 993b9e6920765275d4bbed6b9be2801725022b179539b3816889fde842587fb7 not found: ID does not exist"
Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.072329 4695 scope.go:117] "RemoveContainer" containerID="773b064b92375274fd0c30d78da235737658fd1744d6b198ed58e3640851551f"
Mar 20 11:00:32 crc kubenswrapper[4695]: E0320 11:00:32.072651 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"773b064b92375274fd0c30d78da235737658fd1744d6b198ed58e3640851551f\": container with ID starting with 773b064b92375274fd0c30d78da235737658fd1744d6b198ed58e3640851551f not found: ID does not exist" containerID="773b064b92375274fd0c30d78da235737658fd1744d6b198ed58e3640851551f"
Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.072720 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"773b064b92375274fd0c30d78da235737658fd1744d6b198ed58e3640851551f"} err="failed to get container status \"773b064b92375274fd0c30d78da235737658fd1744d6b198ed58e3640851551f\": rpc error: code = NotFound desc = could not find container \"773b064b92375274fd0c30d78da235737658fd1744d6b198ed58e3640851551f\": container with ID starting with 773b064b92375274fd0c30d78da235737658fd1744d6b198ed58e3640851551f not found: ID does not exist"
Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.072771 4695 scope.go:117] "RemoveContainer" containerID="7e6a5ba88866cb4447a1dc39d8d44d8ccfed81d6432188d2eb5a558ea2b9da29"
Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.087728 4695 scope.go:117] "RemoveContainer" containerID="67a8e6fa1bf53ea14c96131e0c2f7614d425126facee389202f7ba0cfb63016a"
Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.101868 4695 scope.go:117] "RemoveContainer" containerID="d87c64b2c8937d29aa43875b73bb1cfe6be6a70f5f761a8291e7f207518a4467"
Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.121384 4695 scope.go:117] "RemoveContainer" containerID="7e6a5ba88866cb4447a1dc39d8d44d8ccfed81d6432188d2eb5a558ea2b9da29"
Mar 20 11:00:32 crc kubenswrapper[4695]: E0320 11:00:32.121925 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e6a5ba88866cb4447a1dc39d8d44d8ccfed81d6432188d2eb5a558ea2b9da29\": container with ID starting with 7e6a5ba88866cb4447a1dc39d8d44d8ccfed81d6432188d2eb5a558ea2b9da29 not found: ID does not exist" containerID="7e6a5ba88866cb4447a1dc39d8d44d8ccfed81d6432188d2eb5a558ea2b9da29"
Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.121970 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e6a5ba88866cb4447a1dc39d8d44d8ccfed81d6432188d2eb5a558ea2b9da29"} err="failed to get container status \"7e6a5ba88866cb4447a1dc39d8d44d8ccfed81d6432188d2eb5a558ea2b9da29\": rpc error: code = NotFound desc = could not find container \"7e6a5ba88866cb4447a1dc39d8d44d8ccfed81d6432188d2eb5a558ea2b9da29\": container with ID starting with 7e6a5ba88866cb4447a1dc39d8d44d8ccfed81d6432188d2eb5a558ea2b9da29 not found: ID does not exist"
Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.121994 4695 scope.go:117] "RemoveContainer" containerID="67a8e6fa1bf53ea14c96131e0c2f7614d425126facee389202f7ba0cfb63016a"
Mar 20 11:00:32 crc kubenswrapper[4695]: E0320 11:00:32.122297 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67a8e6fa1bf53ea14c96131e0c2f7614d425126facee389202f7ba0cfb63016a\": container with ID starting with 67a8e6fa1bf53ea14c96131e0c2f7614d425126facee389202f7ba0cfb63016a not found: ID does not exist" containerID="67a8e6fa1bf53ea14c96131e0c2f7614d425126facee389202f7ba0cfb63016a"
Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.122324 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67a8e6fa1bf53ea14c96131e0c2f7614d425126facee389202f7ba0cfb63016a"} err="failed to get container status \"67a8e6fa1bf53ea14c96131e0c2f7614d425126facee389202f7ba0cfb63016a\": rpc error: code = NotFound desc = could not find container \"67a8e6fa1bf53ea14c96131e0c2f7614d425126facee389202f7ba0cfb63016a\": container with ID starting with 67a8e6fa1bf53ea14c96131e0c2f7614d425126facee389202f7ba0cfb63016a not found: ID does not exist"
Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.122348 4695 scope.go:117]
"RemoveContainer" containerID="d87c64b2c8937d29aa43875b73bb1cfe6be6a70f5f761a8291e7f207518a4467" Mar 20 11:00:32 crc kubenswrapper[4695]: E0320 11:00:32.122824 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d87c64b2c8937d29aa43875b73bb1cfe6be6a70f5f761a8291e7f207518a4467\": container with ID starting with d87c64b2c8937d29aa43875b73bb1cfe6be6a70f5f761a8291e7f207518a4467 not found: ID does not exist" containerID="d87c64b2c8937d29aa43875b73bb1cfe6be6a70f5f761a8291e7f207518a4467" Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.122850 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d87c64b2c8937d29aa43875b73bb1cfe6be6a70f5f761a8291e7f207518a4467"} err="failed to get container status \"d87c64b2c8937d29aa43875b73bb1cfe6be6a70f5f761a8291e7f207518a4467\": rpc error: code = NotFound desc = could not find container \"d87c64b2c8937d29aa43875b73bb1cfe6be6a70f5f761a8291e7f207518a4467\": container with ID starting with d87c64b2c8937d29aa43875b73bb1cfe6be6a70f5f761a8291e7f207518a4467 not found: ID does not exist" Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.896542 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" path="/var/lib/kubelet/pods/2a6824e3-65ec-404c-ac28-59fce8d50d83/volumes" Mar 20 11:00:32 crc kubenswrapper[4695]: I0320 11:00:32.897806 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2ea4f1f-16e3-4ac7-ac16-f782b94669ff" path="/var/lib/kubelet/pods/d2ea4f1f-16e3-4ac7-ac16-f782b94669ff/volumes" Mar 20 11:00:38 crc kubenswrapper[4695]: I0320 11:00:38.431378 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Mar 20 11:00:38 crc kubenswrapper[4695]: I0320 11:00:38.432833 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.103003 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-w4jh2"] Mar 20 11:01:02 crc kubenswrapper[4695]: E0320 11:01:02.104087 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2ea4f1f-16e3-4ac7-ac16-f782b94669ff" containerName="extract-content" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.104107 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2ea4f1f-16e3-4ac7-ac16-f782b94669ff" containerName="extract-content" Mar 20 11:01:02 crc kubenswrapper[4695]: E0320 11:01:02.104116 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2ea4f1f-16e3-4ac7-ac16-f782b94669ff" containerName="extract-utilities" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.104123 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2ea4f1f-16e3-4ac7-ac16-f782b94669ff" containerName="extract-utilities" Mar 20 11:01:02 crc kubenswrapper[4695]: E0320 11:01:02.104130 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" containerName="extract-content" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.104137 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" containerName="extract-content" Mar 20 11:01:02 crc kubenswrapper[4695]: E0320 11:01:02.104150 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" containerName="extract-content" Mar 20 
11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.104159 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" containerName="extract-content" Mar 20 11:01:02 crc kubenswrapper[4695]: E0320 11:01:02.104169 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="940bd5a7-5746-4008-9f5a-84a7c0dc4c15" containerName="collect-profiles" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.104175 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="940bd5a7-5746-4008-9f5a-84a7c0dc4c15" containerName="collect-profiles" Mar 20 11:01:02 crc kubenswrapper[4695]: E0320 11:01:02.104184 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" containerName="registry-server" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.104191 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" containerName="registry-server" Mar 20 11:01:02 crc kubenswrapper[4695]: E0320 11:01:02.104202 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" containerName="registry-server" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.104210 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" containerName="registry-server" Mar 20 11:01:02 crc kubenswrapper[4695]: E0320 11:01:02.104218 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2ea4f1f-16e3-4ac7-ac16-f782b94669ff" containerName="registry-server" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.104224 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2ea4f1f-16e3-4ac7-ac16-f782b94669ff" containerName="registry-server" Mar 20 11:01:02 crc kubenswrapper[4695]: E0320 11:01:02.104233 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb233657-545c-4a0b-93a0-b29148b5cb3f" containerName="registry-server" Mar 20 11:01:02 crc 
kubenswrapper[4695]: I0320 11:01:02.104239 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb233657-545c-4a0b-93a0-b29148b5cb3f" containerName="registry-server" Mar 20 11:01:02 crc kubenswrapper[4695]: E0320 11:01:02.104251 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" containerName="extract-utilities" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.104257 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" containerName="extract-utilities" Mar 20 11:01:02 crc kubenswrapper[4695]: E0320 11:01:02.104263 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb233657-545c-4a0b-93a0-b29148b5cb3f" containerName="extract-content" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.104269 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb233657-545c-4a0b-93a0-b29148b5cb3f" containerName="extract-content" Mar 20 11:01:02 crc kubenswrapper[4695]: E0320 11:01:02.104288 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a27fd160-2f93-4820-b5cf-566db0042594" containerName="oc" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.104294 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="a27fd160-2f93-4820-b5cf-566db0042594" containerName="oc" Mar 20 11:01:02 crc kubenswrapper[4695]: E0320 11:01:02.104301 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb233657-545c-4a0b-93a0-b29148b5cb3f" containerName="extract-utilities" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.104308 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb233657-545c-4a0b-93a0-b29148b5cb3f" containerName="extract-utilities" Mar 20 11:01:02 crc kubenswrapper[4695]: E0320 11:01:02.104315 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" containerName="extract-utilities" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 
11:01:02.104321 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" containerName="extract-utilities" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.104421 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="940bd5a7-5746-4008-9f5a-84a7c0dc4c15" containerName="collect-profiles" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.104433 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2ea4f1f-16e3-4ac7-ac16-f782b94669ff" containerName="registry-server" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.104442 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb233657-545c-4a0b-93a0-b29148b5cb3f" containerName="registry-server" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.104450 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="a27fd160-2f93-4820-b5cf-566db0042594" containerName="oc" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.104461 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a6824e3-65ec-404c-ac28-59fce8d50d83" containerName="registry-server" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.104471 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="33f393cc-11cf-4c7a-aeac-8423998e5dc6" containerName="registry-server" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.104969 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-w4jh2" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.132588 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-w4jh2"] Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.275868 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7-bound-sa-token\") pod \"image-registry-66df7c8f76-w4jh2\" (UID: \"e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4jh2" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.276202 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-w4jh2\" (UID: \"e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4jh2" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.276248 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-w4jh2\" (UID: \"e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4jh2" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.276473 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n56cd\" (UniqueName: \"kubernetes.io/projected/e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7-kube-api-access-n56cd\") pod \"image-registry-66df7c8f76-w4jh2\" (UID: \"e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-w4jh2" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.276505 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-w4jh2\" (UID: \"e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4jh2" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.276796 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7-registry-tls\") pod \"image-registry-66df7c8f76-w4jh2\" (UID: \"e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4jh2" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.276955 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7-trusted-ca\") pod \"image-registry-66df7c8f76-w4jh2\" (UID: \"e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4jh2" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.277029 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7-registry-certificates\") pod \"image-registry-66df7c8f76-w4jh2\" (UID: \"e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4jh2" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.300614 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-w4jh2\" (UID: \"e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4jh2" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.378557 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7-bound-sa-token\") pod \"image-registry-66df7c8f76-w4jh2\" (UID: \"e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4jh2" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.379128 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-w4jh2\" (UID: \"e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4jh2" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.379161 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-w4jh2\" (UID: \"e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4jh2" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.379191 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n56cd\" (UniqueName: \"kubernetes.io/projected/e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7-kube-api-access-n56cd\") pod \"image-registry-66df7c8f76-w4jh2\" (UID: \"e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4jh2" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.379247 4695 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7-registry-tls\") pod \"image-registry-66df7c8f76-w4jh2\" (UID: \"e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4jh2" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.379289 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7-trusted-ca\") pod \"image-registry-66df7c8f76-w4jh2\" (UID: \"e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4jh2" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.379312 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7-registry-certificates\") pod \"image-registry-66df7c8f76-w4jh2\" (UID: \"e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4jh2" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.379813 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-w4jh2\" (UID: \"e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4jh2" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.380966 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7-trusted-ca\") pod \"image-registry-66df7c8f76-w4jh2\" (UID: \"e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4jh2" Mar 20 11:01:02 crc 
kubenswrapper[4695]: I0320 11:01:02.381556 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7-registry-certificates\") pod \"image-registry-66df7c8f76-w4jh2\" (UID: \"e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4jh2" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.389655 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-w4jh2\" (UID: \"e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4jh2" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.389820 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7-registry-tls\") pod \"image-registry-66df7c8f76-w4jh2\" (UID: \"e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4jh2" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.398015 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7-bound-sa-token\") pod \"image-registry-66df7c8f76-w4jh2\" (UID: \"e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7\") " pod="openshift-image-registry/image-registry-66df7c8f76-w4jh2" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.399700 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n56cd\" (UniqueName: \"kubernetes.io/projected/e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7-kube-api-access-n56cd\") pod \"image-registry-66df7c8f76-w4jh2\" (UID: \"e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-w4jh2" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.428797 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-w4jh2" Mar 20 11:01:02 crc kubenswrapper[4695]: I0320 11:01:02.712028 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-w4jh2"] Mar 20 11:01:03 crc kubenswrapper[4695]: I0320 11:01:03.225634 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-w4jh2" event={"ID":"e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7","Type":"ContainerStarted","Data":"76c22abd57970abd89009a137fc367e9fc8c691a28510863373e26dea882a1b7"} Mar 20 11:01:03 crc kubenswrapper[4695]: I0320 11:01:03.226107 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-w4jh2" event={"ID":"e3fd3ae9-fb0b-40f5-b499-4fc2e0c734c7","Type":"ContainerStarted","Data":"0d762ca32f5e2342de9e054a620074c9a502f274e9ba29280da0c80b1f64c5f5"} Mar 20 11:01:03 crc kubenswrapper[4695]: I0320 11:01:03.228327 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-w4jh2" Mar 20 11:01:04 crc kubenswrapper[4695]: I0320 11:01:04.650391 4695 scope.go:117] "RemoveContainer" containerID="f2470bd643544df19b122ff383a700a10e2d0d957d2c29512e3e9fba5c129ccb" Mar 20 11:01:07 crc kubenswrapper[4695]: I0320 11:01:07.057826 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 11:01:07 crc kubenswrapper[4695]: I0320 11:01:07.058367 4695 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 11:01:07 crc kubenswrapper[4695]: I0320 11:01:07.059035 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 11:01:07 crc kubenswrapper[4695]: I0320 11:01:07.067425 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 11:01:07 crc kubenswrapper[4695]: I0320 11:01:07.088892 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 11:01:07 crc kubenswrapper[4695]: I0320 11:01:07.290498 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-w4jh2" podStartSLOduration=5.290465869 podStartE2EDuration="5.290465869s" podCreationTimestamp="2026-03-20 11:01:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:01:03.249790053 +0000 UTC m=+441.030395636" watchObservedRunningTime="2026-03-20 11:01:07.290465869 +0000 UTC m=+445.071071432" Mar 20 11:01:07 crc kubenswrapper[4695]: I0320 11:01:07.978474 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 11:01:07 crc kubenswrapper[4695]: I0320 11:01:07.978930 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 11:01:07 crc kubenswrapper[4695]: I0320 11:01:07.982594 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 11:01:07 crc 
kubenswrapper[4695]: I0320 11:01:07.982594 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 11:01:08 crc kubenswrapper[4695]: I0320 11:01:08.091812 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 11:01:08 crc kubenswrapper[4695]: I0320 11:01:08.187018 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 11:01:08 crc kubenswrapper[4695]: I0320 11:01:08.266653 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"618a45400f67c73ea4a99e2e0a08963590f88b396f34747fa391607a2fbd1ff3"}
Mar 20 11:01:08 crc kubenswrapper[4695]: I0320 11:01:08.267248 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2f4b44d47493ac42544fa1a55f0156566b7a51345bd3e659126fd019ac2328f5"}
Mar 20 11:01:08 crc kubenswrapper[4695]: W0320 11:01:08.316361 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-a0b14e7dce72b7f65b6c81f28af665d8d5a29a2e51f6215dd2fd5fea4200d842 WatchSource:0}: Error finding container a0b14e7dce72b7f65b6c81f28af665d8d5a29a2e51f6215dd2fd5fea4200d842: Status 404 returned error can't find the container with id a0b14e7dce72b7f65b6c81f28af665d8d5a29a2e51f6215dd2fd5fea4200d842
Mar 20 11:01:08 crc kubenswrapper[4695]: I0320 11:01:08.430579 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 11:01:08 crc kubenswrapper[4695]: I0320 11:01:08.430636 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 11:01:09 crc kubenswrapper[4695]: I0320 11:01:09.274015 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"6f530aeb4fbd23d57697ec58f27aba7e23fce32503ce117d2c8a2733321e36cc"}
Mar 20 11:01:09 crc kubenswrapper[4695]: I0320 11:01:09.274475 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"e1bda1f47f6bbc7b8faa7c48f4de76956eca3a1304d85c5f978f108fe0e55e59"}
Mar 20 11:01:09 crc kubenswrapper[4695]: I0320 11:01:09.274637 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 11:01:09 crc kubenswrapper[4695]: I0320 11:01:09.275647 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8727673ae06911f0b36ab90a8e3b91e056432cf18b98cf0ebb73f55a3b4a7b25"}
Mar 20 11:01:09 crc kubenswrapper[4695]: I0320 11:01:09.275668 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"a0b14e7dce72b7f65b6c81f28af665d8d5a29a2e51f6215dd2fd5fea4200d842"}
Mar 20 11:01:22 crc kubenswrapper[4695]: I0320 11:01:22.438337 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-w4jh2"
Mar 20 11:01:22 crc kubenswrapper[4695]: I0320 11:01:22.510437 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-h8rbk"]
Mar 20 11:01:23 crc kubenswrapper[4695]: I0320 11:01:23.770532 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-28xd8"]
Mar 20 11:01:23 crc kubenswrapper[4695]: I0320 11:01:23.771301 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-28xd8" podUID="da83bf65-5995-41cf-8f79-98a77e0ace2e" containerName="registry-server" containerID="cri-o://e3e9fa14ffce9bf1b0980fe833d05fa119c8c722b051c1703109c5717ad5f9fe" gracePeriod=30
Mar 20 11:01:23 crc kubenswrapper[4695]: I0320 11:01:23.790345 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mzrcb"]
Mar 20 11:01:23 crc kubenswrapper[4695]: I0320 11:01:23.790800 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mzrcb" podUID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" containerName="registry-server" containerID="cri-o://1d38591f7dfdb9068d761219345ed096498521f05df89266d75b9376745f3db4" gracePeriod=30
Mar 20 11:01:23 crc kubenswrapper[4695]: I0320 11:01:23.796017 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-w7b9p"]
Mar 20 11:01:23 crc kubenswrapper[4695]: I0320 11:01:23.796402 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-w7b9p" podUID="36b62628-63ee-4520-8787-ce943f478c0b" containerName="marketplace-operator" containerID="cri-o://ac2d388531bb8b769c88a3fa2ab2e475686eee8f5cd065da34078af139610467" gracePeriod=30
Mar 20 11:01:23 crc kubenswrapper[4695]: I0320 11:01:23.802003 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p78dm"]
Mar 20 11:01:23 crc kubenswrapper[4695]: I0320 11:01:23.802433 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-p78dm" podUID="0b5f000a-cdbc-486a-9e77-d3bf68046cb7" containerName="registry-server" containerID="cri-o://a4d1fe52926db7a4a4059229f0e59594d06e2a09b3fca6a9eb924c27b8c410a4" gracePeriod=30
Mar 20 11:01:23 crc kubenswrapper[4695]: I0320 11:01:23.803241 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kt9vc"]
Mar 20 11:01:23 crc kubenswrapper[4695]: I0320 11:01:23.803565 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kt9vc" podUID="53830966-0b62-40fe-9f81-c18c95ea50aa" containerName="registry-server" containerID="cri-o://913ce7560cbc38d6772f461d47e6929975407b292ec0b623014e42802efa6f87" gracePeriod=30
Mar 20 11:01:23 crc kubenswrapper[4695]: I0320 11:01:23.820179 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5vxrg"]
Mar 20 11:01:23 crc kubenswrapper[4695]: I0320 11:01:23.821093 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5vxrg"
Mar 20 11:01:23 crc kubenswrapper[4695]: I0320 11:01:23.853631 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5vxrg"]
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.020452 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ec6419ea-4cf9-415f-8aba-c775cd497980-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5vxrg\" (UID: \"ec6419ea-4cf9-415f-8aba-c775cd497980\") " pod="openshift-marketplace/marketplace-operator-79b997595-5vxrg"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.020669 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ec6419ea-4cf9-415f-8aba-c775cd497980-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5vxrg\" (UID: \"ec6419ea-4cf9-415f-8aba-c775cd497980\") " pod="openshift-marketplace/marketplace-operator-79b997595-5vxrg"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.020698 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm6h2\" (UniqueName: \"kubernetes.io/projected/ec6419ea-4cf9-415f-8aba-c775cd497980-kube-api-access-hm6h2\") pod \"marketplace-operator-79b997595-5vxrg\" (UID: \"ec6419ea-4cf9-415f-8aba-c775cd497980\") " pod="openshift-marketplace/marketplace-operator-79b997595-5vxrg"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.122612 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ec6419ea-4cf9-415f-8aba-c775cd497980-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5vxrg\" (UID: \"ec6419ea-4cf9-415f-8aba-c775cd497980\") " pod="openshift-marketplace/marketplace-operator-79b997595-5vxrg"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.123265 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm6h2\" (UniqueName: \"kubernetes.io/projected/ec6419ea-4cf9-415f-8aba-c775cd497980-kube-api-access-hm6h2\") pod \"marketplace-operator-79b997595-5vxrg\" (UID: \"ec6419ea-4cf9-415f-8aba-c775cd497980\") " pod="openshift-marketplace/marketplace-operator-79b997595-5vxrg"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.123302 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ec6419ea-4cf9-415f-8aba-c775cd497980-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5vxrg\" (UID: \"ec6419ea-4cf9-415f-8aba-c775cd497980\") " pod="openshift-marketplace/marketplace-operator-79b997595-5vxrg"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.124074 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ec6419ea-4cf9-415f-8aba-c775cd497980-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-5vxrg\" (UID: \"ec6419ea-4cf9-415f-8aba-c775cd497980\") " pod="openshift-marketplace/marketplace-operator-79b997595-5vxrg"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.134954 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ec6419ea-4cf9-415f-8aba-c775cd497980-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-5vxrg\" (UID: \"ec6419ea-4cf9-415f-8aba-c775cd497980\") " pod="openshift-marketplace/marketplace-operator-79b997595-5vxrg"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.142564 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm6h2\" (UniqueName: \"kubernetes.io/projected/ec6419ea-4cf9-415f-8aba-c775cd497980-kube-api-access-hm6h2\") pod \"marketplace-operator-79b997595-5vxrg\" (UID: \"ec6419ea-4cf9-415f-8aba-c775cd497980\") " pod="openshift-marketplace/marketplace-operator-79b997595-5vxrg"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.279202 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-w7b9p"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.319702 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kt9vc"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.428775 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36b62628-63ee-4520-8787-ce943f478c0b-marketplace-trusted-ca\") pod \"36b62628-63ee-4520-8787-ce943f478c0b\" (UID: \"36b62628-63ee-4520-8787-ce943f478c0b\") "
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.428849 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/36b62628-63ee-4520-8787-ce943f478c0b-marketplace-operator-metrics\") pod \"36b62628-63ee-4520-8787-ce943f478c0b\" (UID: \"36b62628-63ee-4520-8787-ce943f478c0b\") "
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.428950 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53830966-0b62-40fe-9f81-c18c95ea50aa-utilities\") pod \"53830966-0b62-40fe-9f81-c18c95ea50aa\" (UID: \"53830966-0b62-40fe-9f81-c18c95ea50aa\") "
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.429011 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvdfd\" (UniqueName: \"kubernetes.io/projected/36b62628-63ee-4520-8787-ce943f478c0b-kube-api-access-dvdfd\") pod \"36b62628-63ee-4520-8787-ce943f478c0b\" (UID: \"36b62628-63ee-4520-8787-ce943f478c0b\") "
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.429040 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4wl4\" (UniqueName: \"kubernetes.io/projected/53830966-0b62-40fe-9f81-c18c95ea50aa-kube-api-access-g4wl4\") pod \"53830966-0b62-40fe-9f81-c18c95ea50aa\" (UID: \"53830966-0b62-40fe-9f81-c18c95ea50aa\") "
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.429064 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53830966-0b62-40fe-9f81-c18c95ea50aa-catalog-content\") pod \"53830966-0b62-40fe-9f81-c18c95ea50aa\" (UID: \"53830966-0b62-40fe-9f81-c18c95ea50aa\") "
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.430257 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36b62628-63ee-4520-8787-ce943f478c0b-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "36b62628-63ee-4520-8787-ce943f478c0b" (UID: "36b62628-63ee-4520-8787-ce943f478c0b"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.434133 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36b62628-63ee-4520-8787-ce943f478c0b-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "36b62628-63ee-4520-8787-ce943f478c0b" (UID: "36b62628-63ee-4520-8787-ce943f478c0b"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.434937 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53830966-0b62-40fe-9f81-c18c95ea50aa-kube-api-access-g4wl4" (OuterVolumeSpecName: "kube-api-access-g4wl4") pod "53830966-0b62-40fe-9f81-c18c95ea50aa" (UID: "53830966-0b62-40fe-9f81-c18c95ea50aa"). InnerVolumeSpecName "kube-api-access-g4wl4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.436082 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53830966-0b62-40fe-9f81-c18c95ea50aa-utilities" (OuterVolumeSpecName: "utilities") pod "53830966-0b62-40fe-9f81-c18c95ea50aa" (UID: "53830966-0b62-40fe-9f81-c18c95ea50aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.438096 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36b62628-63ee-4520-8787-ce943f478c0b-kube-api-access-dvdfd" (OuterVolumeSpecName: "kube-api-access-dvdfd") pod "36b62628-63ee-4520-8787-ce943f478c0b" (UID: "36b62628-63ee-4520-8787-ce943f478c0b"). InnerVolumeSpecName "kube-api-access-dvdfd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.441287 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-5vxrg"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.531247 4695 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/36b62628-63ee-4520-8787-ce943f478c0b-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.531297 4695 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53830966-0b62-40fe-9f81-c18c95ea50aa-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.531313 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvdfd\" (UniqueName: \"kubernetes.io/projected/36b62628-63ee-4520-8787-ce943f478c0b-kube-api-access-dvdfd\") on node \"crc\" DevicePath \"\""
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.531327 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4wl4\" (UniqueName: \"kubernetes.io/projected/53830966-0b62-40fe-9f81-c18c95ea50aa-kube-api-access-g4wl4\") on node \"crc\" DevicePath \"\""
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.531342 4695 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36b62628-63ee-4520-8787-ce943f478c0b-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.626633 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53830966-0b62-40fe-9f81-c18c95ea50aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "53830966-0b62-40fe-9f81-c18c95ea50aa" (UID: "53830966-0b62-40fe-9f81-c18c95ea50aa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.628075 4695 generic.go:334] "Generic (PLEG): container finished" podID="da83bf65-5995-41cf-8f79-98a77e0ace2e" containerID="e3e9fa14ffce9bf1b0980fe833d05fa119c8c722b051c1703109c5717ad5f9fe" exitCode=0
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.628227 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28xd8" event={"ID":"da83bf65-5995-41cf-8f79-98a77e0ace2e","Type":"ContainerDied","Data":"e3e9fa14ffce9bf1b0980fe833d05fa119c8c722b051c1703109c5717ad5f9fe"}
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.632141 4695 generic.go:334] "Generic (PLEG): container finished" podID="53830966-0b62-40fe-9f81-c18c95ea50aa" containerID="913ce7560cbc38d6772f461d47e6929975407b292ec0b623014e42802efa6f87" exitCode=0
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.632253 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kt9vc"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.632322 4695 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53830966-0b62-40fe-9f81-c18c95ea50aa-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.632391 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kt9vc" event={"ID":"53830966-0b62-40fe-9f81-c18c95ea50aa","Type":"ContainerDied","Data":"913ce7560cbc38d6772f461d47e6929975407b292ec0b623014e42802efa6f87"}
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.632473 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kt9vc" event={"ID":"53830966-0b62-40fe-9f81-c18c95ea50aa","Type":"ContainerDied","Data":"58a144ff3006720da3b38d6cc9128c7f419cbf4abcb6e3f04fde0f77df6759ca"}
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.632503 4695 scope.go:117] "RemoveContainer" containerID="913ce7560cbc38d6772f461d47e6929975407b292ec0b623014e42802efa6f87"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.641513 4695 generic.go:334] "Generic (PLEG): container finished" podID="36b62628-63ee-4520-8787-ce943f478c0b" containerID="ac2d388531bb8b769c88a3fa2ab2e475686eee8f5cd065da34078af139610467" exitCode=0
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.641584 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-w7b9p"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.641630 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-w7b9p" event={"ID":"36b62628-63ee-4520-8787-ce943f478c0b","Type":"ContainerDied","Data":"ac2d388531bb8b769c88a3fa2ab2e475686eee8f5cd065da34078af139610467"}
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.641751 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-w7b9p" event={"ID":"36b62628-63ee-4520-8787-ce943f478c0b","Type":"ContainerDied","Data":"2c113c144aaad3064b2842c18ca024dfec724b41312f98dbddf071aaf5136edf"}
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.644567 4695 generic.go:334] "Generic (PLEG): container finished" podID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" containerID="1d38591f7dfdb9068d761219345ed096498521f05df89266d75b9376745f3db4" exitCode=0
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.644630 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mzrcb" event={"ID":"e96190cb-8d03-4cb7-b3f6-6b46a141f969","Type":"ContainerDied","Data":"1d38591f7dfdb9068d761219345ed096498521f05df89266d75b9376745f3db4"}
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.644752 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-28xd8"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.646700 4695 generic.go:334] "Generic (PLEG): container finished" podID="0b5f000a-cdbc-486a-9e77-d3bf68046cb7" containerID="a4d1fe52926db7a4a4059229f0e59594d06e2a09b3fca6a9eb924c27b8c410a4" exitCode=0
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.646730 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p78dm" event={"ID":"0b5f000a-cdbc-486a-9e77-d3bf68046cb7","Type":"ContainerDied","Data":"a4d1fe52926db7a4a4059229f0e59594d06e2a09b3fca6a9eb924c27b8c410a4"}
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.676488 4695 scope.go:117] "RemoveContainer" containerID="65a4d61eb9a50322c9f027e48ec99783c57db114dbcb7863ad9010c2a35005a8"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.701315 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kt9vc"]
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.710653 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kt9vc"]
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.742738 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-w7b9p"]
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.747455 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-w7b9p"]
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.752996 4695 scope.go:117] "RemoveContainer" containerID="1520aa3c3e7898f956303e00d1fb82486c0a621e77e60faf3412dff24393e008"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.769472 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-5vxrg"]
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.774494 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p78dm"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.790821 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mzrcb"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.806008 4695 scope.go:117] "RemoveContainer" containerID="913ce7560cbc38d6772f461d47e6929975407b292ec0b623014e42802efa6f87"
Mar 20 11:01:24 crc kubenswrapper[4695]: E0320 11:01:24.808302 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"913ce7560cbc38d6772f461d47e6929975407b292ec0b623014e42802efa6f87\": container with ID starting with 913ce7560cbc38d6772f461d47e6929975407b292ec0b623014e42802efa6f87 not found: ID does not exist" containerID="913ce7560cbc38d6772f461d47e6929975407b292ec0b623014e42802efa6f87"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.808354 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"913ce7560cbc38d6772f461d47e6929975407b292ec0b623014e42802efa6f87"} err="failed to get container status \"913ce7560cbc38d6772f461d47e6929975407b292ec0b623014e42802efa6f87\": rpc error: code = NotFound desc = could not find container \"913ce7560cbc38d6772f461d47e6929975407b292ec0b623014e42802efa6f87\": container with ID starting with 913ce7560cbc38d6772f461d47e6929975407b292ec0b623014e42802efa6f87 not found: ID does not exist"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.808391 4695 scope.go:117] "RemoveContainer" containerID="65a4d61eb9a50322c9f027e48ec99783c57db114dbcb7863ad9010c2a35005a8"
Mar 20 11:01:24 crc kubenswrapper[4695]: E0320 11:01:24.812488 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65a4d61eb9a50322c9f027e48ec99783c57db114dbcb7863ad9010c2a35005a8\": container with ID starting with 65a4d61eb9a50322c9f027e48ec99783c57db114dbcb7863ad9010c2a35005a8 not found: ID does not exist" containerID="65a4d61eb9a50322c9f027e48ec99783c57db114dbcb7863ad9010c2a35005a8"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.812540 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65a4d61eb9a50322c9f027e48ec99783c57db114dbcb7863ad9010c2a35005a8"} err="failed to get container status \"65a4d61eb9a50322c9f027e48ec99783c57db114dbcb7863ad9010c2a35005a8\": rpc error: code = NotFound desc = could not find container \"65a4d61eb9a50322c9f027e48ec99783c57db114dbcb7863ad9010c2a35005a8\": container with ID starting with 65a4d61eb9a50322c9f027e48ec99783c57db114dbcb7863ad9010c2a35005a8 not found: ID does not exist"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.812584 4695 scope.go:117] "RemoveContainer" containerID="1520aa3c3e7898f956303e00d1fb82486c0a621e77e60faf3412dff24393e008"
Mar 20 11:01:24 crc kubenswrapper[4695]: E0320 11:01:24.813163 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1520aa3c3e7898f956303e00d1fb82486c0a621e77e60faf3412dff24393e008\": container with ID starting with 1520aa3c3e7898f956303e00d1fb82486c0a621e77e60faf3412dff24393e008 not found: ID does not exist" containerID="1520aa3c3e7898f956303e00d1fb82486c0a621e77e60faf3412dff24393e008"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.813191 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1520aa3c3e7898f956303e00d1fb82486c0a621e77e60faf3412dff24393e008"} err="failed to get container status \"1520aa3c3e7898f956303e00d1fb82486c0a621e77e60faf3412dff24393e008\": rpc error: code = NotFound desc = could not find container \"1520aa3c3e7898f956303e00d1fb82486c0a621e77e60faf3412dff24393e008\": container with ID starting with 1520aa3c3e7898f956303e00d1fb82486c0a621e77e60faf3412dff24393e008 not found: ID does not exist"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.813206 4695 scope.go:117] "RemoveContainer" containerID="ac2d388531bb8b769c88a3fa2ab2e475686eee8f5cd065da34078af139610467"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.837069 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da83bf65-5995-41cf-8f79-98a77e0ace2e-utilities\") pod \"da83bf65-5995-41cf-8f79-98a77e0ace2e\" (UID: \"da83bf65-5995-41cf-8f79-98a77e0ace2e\") "
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.839471 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da83bf65-5995-41cf-8f79-98a77e0ace2e-catalog-content\") pod \"da83bf65-5995-41cf-8f79-98a77e0ace2e\" (UID: \"da83bf65-5995-41cf-8f79-98a77e0ace2e\") "
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.839563 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxcph\" (UniqueName: \"kubernetes.io/projected/da83bf65-5995-41cf-8f79-98a77e0ace2e-kube-api-access-rxcph\") pod \"da83bf65-5995-41cf-8f79-98a77e0ace2e\" (UID: \"da83bf65-5995-41cf-8f79-98a77e0ace2e\") "
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.841877 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da83bf65-5995-41cf-8f79-98a77e0ace2e-utilities" (OuterVolumeSpecName: "utilities") pod "da83bf65-5995-41cf-8f79-98a77e0ace2e" (UID: "da83bf65-5995-41cf-8f79-98a77e0ace2e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.846244 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da83bf65-5995-41cf-8f79-98a77e0ace2e-kube-api-access-rxcph" (OuterVolumeSpecName: "kube-api-access-rxcph") pod "da83bf65-5995-41cf-8f79-98a77e0ace2e" (UID: "da83bf65-5995-41cf-8f79-98a77e0ace2e"). InnerVolumeSpecName "kube-api-access-rxcph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.873061 4695 scope.go:117] "RemoveContainer" containerID="ac2d388531bb8b769c88a3fa2ab2e475686eee8f5cd065da34078af139610467"
Mar 20 11:01:24 crc kubenswrapper[4695]: E0320 11:01:24.876685 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac2d388531bb8b769c88a3fa2ab2e475686eee8f5cd065da34078af139610467\": container with ID starting with ac2d388531bb8b769c88a3fa2ab2e475686eee8f5cd065da34078af139610467 not found: ID does not exist" containerID="ac2d388531bb8b769c88a3fa2ab2e475686eee8f5cd065da34078af139610467"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.876743 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac2d388531bb8b769c88a3fa2ab2e475686eee8f5cd065da34078af139610467"} err="failed to get container status \"ac2d388531bb8b769c88a3fa2ab2e475686eee8f5cd065da34078af139610467\": rpc error: code = NotFound desc = could not find container \"ac2d388531bb8b769c88a3fa2ab2e475686eee8f5cd065da34078af139610467\": container with ID starting with ac2d388531bb8b769c88a3fa2ab2e475686eee8f5cd065da34078af139610467 not found: ID does not exist"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.898646 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36b62628-63ee-4520-8787-ce943f478c0b" path="/var/lib/kubelet/pods/36b62628-63ee-4520-8787-ce943f478c0b/volumes"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.899321 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53830966-0b62-40fe-9f81-c18c95ea50aa" path="/var/lib/kubelet/pods/53830966-0b62-40fe-9f81-c18c95ea50aa/volumes"
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.901004 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da83bf65-5995-41cf-8f79-98a77e0ace2e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da83bf65-5995-41cf-8f79-98a77e0ace2e" (UID: "da83bf65-5995-41cf-8f79-98a77e0ace2e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.941222 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b5f000a-cdbc-486a-9e77-d3bf68046cb7-catalog-content\") pod \"0b5f000a-cdbc-486a-9e77-d3bf68046cb7\" (UID: \"0b5f000a-cdbc-486a-9e77-d3bf68046cb7\") "
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.941302 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e96190cb-8d03-4cb7-b3f6-6b46a141f969-utilities\") pod \"e96190cb-8d03-4cb7-b3f6-6b46a141f969\" (UID: \"e96190cb-8d03-4cb7-b3f6-6b46a141f969\") "
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.941361 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b5f000a-cdbc-486a-9e77-d3bf68046cb7-utilities\") pod \"0b5f000a-cdbc-486a-9e77-d3bf68046cb7\" (UID: \"0b5f000a-cdbc-486a-9e77-d3bf68046cb7\") "
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.941401 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6786\" (UniqueName: \"kubernetes.io/projected/e96190cb-8d03-4cb7-b3f6-6b46a141f969-kube-api-access-w6786\") pod \"e96190cb-8d03-4cb7-b3f6-6b46a141f969\" (UID: \"e96190cb-8d03-4cb7-b3f6-6b46a141f969\") "
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.941445 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stks6\" (UniqueName: \"kubernetes.io/projected/0b5f000a-cdbc-486a-9e77-d3bf68046cb7-kube-api-access-stks6\") pod \"0b5f000a-cdbc-486a-9e77-d3bf68046cb7\" (UID: \"0b5f000a-cdbc-486a-9e77-d3bf68046cb7\") "
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.941534 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e96190cb-8d03-4cb7-b3f6-6b46a141f969-catalog-content\") pod \"e96190cb-8d03-4cb7-b3f6-6b46a141f969\" (UID: \"e96190cb-8d03-4cb7-b3f6-6b46a141f969\") "
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.941811 4695 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da83bf65-5995-41cf-8f79-98a77e0ace2e-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.941828 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxcph\" (UniqueName: \"kubernetes.io/projected/da83bf65-5995-41cf-8f79-98a77e0ace2e-kube-api-access-rxcph\") on node \"crc\" DevicePath \"\""
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.941842 4695 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da83bf65-5995-41cf-8f79-98a77e0ace2e-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.942638 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e96190cb-8d03-4cb7-b3f6-6b46a141f969-utilities" (OuterVolumeSpecName: "utilities") pod "e96190cb-8d03-4cb7-b3f6-6b46a141f969" (UID: "e96190cb-8d03-4cb7-b3f6-6b46a141f969"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.943290 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b5f000a-cdbc-486a-9e77-d3bf68046cb7-utilities" (OuterVolumeSpecName: "utilities") pod "0b5f000a-cdbc-486a-9e77-d3bf68046cb7" (UID: "0b5f000a-cdbc-486a-9e77-d3bf68046cb7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.945758 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b5f000a-cdbc-486a-9e77-d3bf68046cb7-kube-api-access-stks6" (OuterVolumeSpecName: "kube-api-access-stks6") pod "0b5f000a-cdbc-486a-9e77-d3bf68046cb7" (UID: "0b5f000a-cdbc-486a-9e77-d3bf68046cb7"). InnerVolumeSpecName "kube-api-access-stks6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.946502 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e96190cb-8d03-4cb7-b3f6-6b46a141f969-kube-api-access-w6786" (OuterVolumeSpecName: "kube-api-access-w6786") pod "e96190cb-8d03-4cb7-b3f6-6b46a141f969" (UID: "e96190cb-8d03-4cb7-b3f6-6b46a141f969"). InnerVolumeSpecName "kube-api-access-w6786". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:01:24 crc kubenswrapper[4695]: I0320 11:01:24.969415 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b5f000a-cdbc-486a-9e77-d3bf68046cb7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b5f000a-cdbc-486a-9e77-d3bf68046cb7" (UID: "0b5f000a-cdbc-486a-9e77-d3bf68046cb7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.002039 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e96190cb-8d03-4cb7-b3f6-6b46a141f969-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e96190cb-8d03-4cb7-b3f6-6b46a141f969" (UID: "e96190cb-8d03-4cb7-b3f6-6b46a141f969"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.043532 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6786\" (UniqueName: \"kubernetes.io/projected/e96190cb-8d03-4cb7-b3f6-6b46a141f969-kube-api-access-w6786\") on node \"crc\" DevicePath \"\""
Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.043867 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stks6\" (UniqueName: \"kubernetes.io/projected/0b5f000a-cdbc-486a-9e77-d3bf68046cb7-kube-api-access-stks6\") on node \"crc\" DevicePath \"\""
Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.044341 4695 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e96190cb-8d03-4cb7-b3f6-6b46a141f969-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.044362 4695 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e96190cb-8d03-4cb7-b3f6-6b46a141f969-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.044377 4695 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b5f000a-cdbc-486a-9e77-d3bf68046cb7-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.044396 4695 reconciler_common.go:293] "Volume detached for volume \"utilities\"
(UniqueName: \"kubernetes.io/empty-dir/0b5f000a-cdbc-486a-9e77-d3bf68046cb7-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.665288 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5vxrg" event={"ID":"ec6419ea-4cf9-415f-8aba-c775cd497980","Type":"ContainerStarted","Data":"bca11e47796886521eb2b901d51f1536b936066a688518fab92c8be7dcca72c6"} Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.665354 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-5vxrg" event={"ID":"ec6419ea-4cf9-415f-8aba-c775cd497980","Type":"ContainerStarted","Data":"f8d661279b746580f30c1a36ef5bad1a5ebd7747eec42a1f6b345bdd9b16f18e"} Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.665957 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-5vxrg" Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.671753 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-5vxrg" Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.682512 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mzrcb" event={"ID":"e96190cb-8d03-4cb7-b3f6-6b46a141f969","Type":"ContainerDied","Data":"33c1853559c370a3b02b5a1671bc00d072e82919a538828219fd73f2b0b08b6a"} Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.682609 4695 scope.go:117] "RemoveContainer" containerID="1d38591f7dfdb9068d761219345ed096498521f05df89266d75b9376745f3db4" Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.682811 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mzrcb" Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.692609 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-p78dm" event={"ID":"0b5f000a-cdbc-486a-9e77-d3bf68046cb7","Type":"ContainerDied","Data":"6460a31bf6b18d6925e842d6f4cbc96a04c81eacb7e0790e588090999bab2134"} Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.692788 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-p78dm" Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.694832 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-5vxrg" podStartSLOduration=2.694782176 podStartE2EDuration="2.694782176s" podCreationTimestamp="2026-03-20 11:01:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:01:25.688035655 +0000 UTC m=+463.468641218" watchObservedRunningTime="2026-03-20 11:01:25.694782176 +0000 UTC m=+463.475387759" Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.699461 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-28xd8" event={"ID":"da83bf65-5995-41cf-8f79-98a77e0ace2e","Type":"ContainerDied","Data":"7fb86a4ba9ddf2f8a7f072339b16e593c02e09aa56f102993852eff5d2b93840"} Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.699565 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-28xd8" Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.705134 4695 scope.go:117] "RemoveContainer" containerID="a385b8dfa2b4cc8c33beb2702f2c9eb8e77c3930cc53a97e0a9560c3a40f0c2d" Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.756151 4695 scope.go:117] "RemoveContainer" containerID="30067ec28f761750c474c359bf8e0ad6d8a2878f533a1534fdf10cb2d51357a0" Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.756208 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mzrcb"] Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.779182 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mzrcb"] Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.787269 4695 scope.go:117] "RemoveContainer" containerID="a4d1fe52926db7a4a4059229f0e59594d06e2a09b3fca6a9eb924c27b8c410a4" Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.807030 4695 scope.go:117] "RemoveContainer" containerID="050a44b4542238f8ce92766bf19917b2a2456b70bd029b400da70c7736b4ac32" Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.810225 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-p78dm"] Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.820976 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-p78dm"] Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.828360 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-28xd8"] Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.833787 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-28xd8"] Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.834901 4695 scope.go:117] "RemoveContainer" 
containerID="751f477e1fd16b2abcde0fd1298428c0718ce2b773e9cbb6887e8322497094f9" Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.860713 4695 scope.go:117] "RemoveContainer" containerID="e3e9fa14ffce9bf1b0980fe833d05fa119c8c722b051c1703109c5717ad5f9fe" Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.881679 4695 scope.go:117] "RemoveContainer" containerID="489bdf505868738ef5d106284c55cc6da8b8deabd146e53f91dca8da7d9958a6" Mar 20 11:01:25 crc kubenswrapper[4695]: I0320 11:01:25.899376 4695 scope.go:117] "RemoveContainer" containerID="b07be4cac708dfa83396768af9a3ab98c1fc9719042388a003f699e72f5f187b" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.187876 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-865c5"] Mar 20 11:01:26 crc kubenswrapper[4695]: E0320 11:01:26.188299 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da83bf65-5995-41cf-8f79-98a77e0ace2e" containerName="extract-utilities" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.188323 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="da83bf65-5995-41cf-8f79-98a77e0ace2e" containerName="extract-utilities" Mar 20 11:01:26 crc kubenswrapper[4695]: E0320 11:01:26.188335 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b5f000a-cdbc-486a-9e77-d3bf68046cb7" containerName="extract-utilities" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.188341 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b5f000a-cdbc-486a-9e77-d3bf68046cb7" containerName="extract-utilities" Mar 20 11:01:26 crc kubenswrapper[4695]: E0320 11:01:26.188355 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b5f000a-cdbc-486a-9e77-d3bf68046cb7" containerName="extract-content" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.188362 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b5f000a-cdbc-486a-9e77-d3bf68046cb7" containerName="extract-content" Mar 20 11:01:26 
crc kubenswrapper[4695]: E0320 11:01:26.188374 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36b62628-63ee-4520-8787-ce943f478c0b" containerName="marketplace-operator" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.188380 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="36b62628-63ee-4520-8787-ce943f478c0b" containerName="marketplace-operator" Mar 20 11:01:26 crc kubenswrapper[4695]: E0320 11:01:26.188388 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" containerName="extract-utilities" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.188394 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" containerName="extract-utilities" Mar 20 11:01:26 crc kubenswrapper[4695]: E0320 11:01:26.188406 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da83bf65-5995-41cf-8f79-98a77e0ace2e" containerName="registry-server" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.188411 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="da83bf65-5995-41cf-8f79-98a77e0ace2e" containerName="registry-server" Mar 20 11:01:26 crc kubenswrapper[4695]: E0320 11:01:26.188422 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53830966-0b62-40fe-9f81-c18c95ea50aa" containerName="extract-utilities" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.188427 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="53830966-0b62-40fe-9f81-c18c95ea50aa" containerName="extract-utilities" Mar 20 11:01:26 crc kubenswrapper[4695]: E0320 11:01:26.188436 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b5f000a-cdbc-486a-9e77-d3bf68046cb7" containerName="registry-server" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.188442 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b5f000a-cdbc-486a-9e77-d3bf68046cb7" containerName="registry-server" Mar 20 
11:01:26 crc kubenswrapper[4695]: E0320 11:01:26.188449 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53830966-0b62-40fe-9f81-c18c95ea50aa" containerName="extract-content" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.188455 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="53830966-0b62-40fe-9f81-c18c95ea50aa" containerName="extract-content" Mar 20 11:01:26 crc kubenswrapper[4695]: E0320 11:01:26.188464 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" containerName="registry-server" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.188470 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" containerName="registry-server" Mar 20 11:01:26 crc kubenswrapper[4695]: E0320 11:01:26.188481 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" containerName="extract-content" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.188487 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" containerName="extract-content" Mar 20 11:01:26 crc kubenswrapper[4695]: E0320 11:01:26.188494 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da83bf65-5995-41cf-8f79-98a77e0ace2e" containerName="extract-content" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.188501 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="da83bf65-5995-41cf-8f79-98a77e0ace2e" containerName="extract-content" Mar 20 11:01:26 crc kubenswrapper[4695]: E0320 11:01:26.188510 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53830966-0b62-40fe-9f81-c18c95ea50aa" containerName="registry-server" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.188515 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="53830966-0b62-40fe-9f81-c18c95ea50aa" containerName="registry-server" Mar 20 11:01:26 crc 
kubenswrapper[4695]: I0320 11:01:26.188650 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="da83bf65-5995-41cf-8f79-98a77e0ace2e" containerName="registry-server" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.188667 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b5f000a-cdbc-486a-9e77-d3bf68046cb7" containerName="registry-server" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.188691 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" containerName="registry-server" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.188704 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="36b62628-63ee-4520-8787-ce943f478c0b" containerName="marketplace-operator" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.188711 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="53830966-0b62-40fe-9f81-c18c95ea50aa" containerName="registry-server" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.189668 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-865c5" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.192568 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.209538 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-865c5"] Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.264878 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b4f2e1f-35d1-4755-9e68-9a57f30f9423-catalog-content\") pod \"redhat-operators-865c5\" (UID: \"2b4f2e1f-35d1-4755-9e68-9a57f30f9423\") " pod="openshift-marketplace/redhat-operators-865c5" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.264970 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b4f2e1f-35d1-4755-9e68-9a57f30f9423-utilities\") pod \"redhat-operators-865c5\" (UID: \"2b4f2e1f-35d1-4755-9e68-9a57f30f9423\") " pod="openshift-marketplace/redhat-operators-865c5" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.265012 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvm56\" (UniqueName: \"kubernetes.io/projected/2b4f2e1f-35d1-4755-9e68-9a57f30f9423-kube-api-access-mvm56\") pod \"redhat-operators-865c5\" (UID: \"2b4f2e1f-35d1-4755-9e68-9a57f30f9423\") " pod="openshift-marketplace/redhat-operators-865c5" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.366346 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b4f2e1f-35d1-4755-9e68-9a57f30f9423-catalog-content\") pod \"redhat-operators-865c5\" (UID: 
\"2b4f2e1f-35d1-4755-9e68-9a57f30f9423\") " pod="openshift-marketplace/redhat-operators-865c5" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.366412 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b4f2e1f-35d1-4755-9e68-9a57f30f9423-utilities\") pod \"redhat-operators-865c5\" (UID: \"2b4f2e1f-35d1-4755-9e68-9a57f30f9423\") " pod="openshift-marketplace/redhat-operators-865c5" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.366452 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvm56\" (UniqueName: \"kubernetes.io/projected/2b4f2e1f-35d1-4755-9e68-9a57f30f9423-kube-api-access-mvm56\") pod \"redhat-operators-865c5\" (UID: \"2b4f2e1f-35d1-4755-9e68-9a57f30f9423\") " pod="openshift-marketplace/redhat-operators-865c5" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.367135 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b4f2e1f-35d1-4755-9e68-9a57f30f9423-catalog-content\") pod \"redhat-operators-865c5\" (UID: \"2b4f2e1f-35d1-4755-9e68-9a57f30f9423\") " pod="openshift-marketplace/redhat-operators-865c5" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.367226 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b4f2e1f-35d1-4755-9e68-9a57f30f9423-utilities\") pod \"redhat-operators-865c5\" (UID: \"2b4f2e1f-35d1-4755-9e68-9a57f30f9423\") " pod="openshift-marketplace/redhat-operators-865c5" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.388124 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvm56\" (UniqueName: \"kubernetes.io/projected/2b4f2e1f-35d1-4755-9e68-9a57f30f9423-kube-api-access-mvm56\") pod \"redhat-operators-865c5\" (UID: \"2b4f2e1f-35d1-4755-9e68-9a57f30f9423\") " 
pod="openshift-marketplace/redhat-operators-865c5" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.511439 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-865c5" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.894788 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b5f000a-cdbc-486a-9e77-d3bf68046cb7" path="/var/lib/kubelet/pods/0b5f000a-cdbc-486a-9e77-d3bf68046cb7/volumes" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.895705 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da83bf65-5995-41cf-8f79-98a77e0ace2e" path="/var/lib/kubelet/pods/da83bf65-5995-41cf-8f79-98a77e0ace2e/volumes" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.896544 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e96190cb-8d03-4cb7-b3f6-6b46a141f969" path="/var/lib/kubelet/pods/e96190cb-8d03-4cb7-b3f6-6b46a141f969/volumes" Mar 20 11:01:26 crc kubenswrapper[4695]: I0320 11:01:26.935868 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-865c5"] Mar 20 11:01:27 crc kubenswrapper[4695]: I0320 11:01:27.727220 4695 generic.go:334] "Generic (PLEG): container finished" podID="2b4f2e1f-35d1-4755-9e68-9a57f30f9423" containerID="96565993e2c9abe3f3558fd13bd82fc77a8ab0e784ee228ebfc15fc46b14be75" exitCode=0 Mar 20 11:01:27 crc kubenswrapper[4695]: I0320 11:01:27.727368 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-865c5" event={"ID":"2b4f2e1f-35d1-4755-9e68-9a57f30f9423","Type":"ContainerDied","Data":"96565993e2c9abe3f3558fd13bd82fc77a8ab0e784ee228ebfc15fc46b14be75"} Mar 20 11:01:27 crc kubenswrapper[4695]: I0320 11:01:27.727445 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-865c5" 
event={"ID":"2b4f2e1f-35d1-4755-9e68-9a57f30f9423","Type":"ContainerStarted","Data":"3b636c649e0ea0ac70d05d5e4063b261a9057ff3a55fd618a8c4e45a213802a6"} Mar 20 11:01:27 crc kubenswrapper[4695]: I0320 11:01:27.990031 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wd4q4"] Mar 20 11:01:27 crc kubenswrapper[4695]: I0320 11:01:27.991926 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wd4q4" Mar 20 11:01:27 crc kubenswrapper[4695]: I0320 11:01:27.994649 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 11:01:27 crc kubenswrapper[4695]: I0320 11:01:27.997445 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8wlt\" (UniqueName: \"kubernetes.io/projected/022dc941-0ef6-4062-9089-3b4fd1c2e404-kube-api-access-h8wlt\") pod \"certified-operators-wd4q4\" (UID: \"022dc941-0ef6-4062-9089-3b4fd1c2e404\") " pod="openshift-marketplace/certified-operators-wd4q4" Mar 20 11:01:27 crc kubenswrapper[4695]: I0320 11:01:27.997524 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/022dc941-0ef6-4062-9089-3b4fd1c2e404-utilities\") pod \"certified-operators-wd4q4\" (UID: \"022dc941-0ef6-4062-9089-3b4fd1c2e404\") " pod="openshift-marketplace/certified-operators-wd4q4" Mar 20 11:01:27 crc kubenswrapper[4695]: I0320 11:01:27.997572 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/022dc941-0ef6-4062-9089-3b4fd1c2e404-catalog-content\") pod \"certified-operators-wd4q4\" (UID: \"022dc941-0ef6-4062-9089-3b4fd1c2e404\") " pod="openshift-marketplace/certified-operators-wd4q4" Mar 20 11:01:28 crc kubenswrapper[4695]: 
I0320 11:01:28.004877 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wd4q4"] Mar 20 11:01:28 crc kubenswrapper[4695]: I0320 11:01:28.099229 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/022dc941-0ef6-4062-9089-3b4fd1c2e404-utilities\") pod \"certified-operators-wd4q4\" (UID: \"022dc941-0ef6-4062-9089-3b4fd1c2e404\") " pod="openshift-marketplace/certified-operators-wd4q4" Mar 20 11:01:28 crc kubenswrapper[4695]: I0320 11:01:28.099303 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/022dc941-0ef6-4062-9089-3b4fd1c2e404-catalog-content\") pod \"certified-operators-wd4q4\" (UID: \"022dc941-0ef6-4062-9089-3b4fd1c2e404\") " pod="openshift-marketplace/certified-operators-wd4q4" Mar 20 11:01:28 crc kubenswrapper[4695]: I0320 11:01:28.099363 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8wlt\" (UniqueName: \"kubernetes.io/projected/022dc941-0ef6-4062-9089-3b4fd1c2e404-kube-api-access-h8wlt\") pod \"certified-operators-wd4q4\" (UID: \"022dc941-0ef6-4062-9089-3b4fd1c2e404\") " pod="openshift-marketplace/certified-operators-wd4q4" Mar 20 11:01:28 crc kubenswrapper[4695]: I0320 11:01:28.100565 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/022dc941-0ef6-4062-9089-3b4fd1c2e404-catalog-content\") pod \"certified-operators-wd4q4\" (UID: \"022dc941-0ef6-4062-9089-3b4fd1c2e404\") " pod="openshift-marketplace/certified-operators-wd4q4" Mar 20 11:01:28 crc kubenswrapper[4695]: I0320 11:01:28.100688 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/022dc941-0ef6-4062-9089-3b4fd1c2e404-utilities\") pod \"certified-operators-wd4q4\" (UID: 
\"022dc941-0ef6-4062-9089-3b4fd1c2e404\") " pod="openshift-marketplace/certified-operators-wd4q4" Mar 20 11:01:28 crc kubenswrapper[4695]: I0320 11:01:28.126976 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8wlt\" (UniqueName: \"kubernetes.io/projected/022dc941-0ef6-4062-9089-3b4fd1c2e404-kube-api-access-h8wlt\") pod \"certified-operators-wd4q4\" (UID: \"022dc941-0ef6-4062-9089-3b4fd1c2e404\") " pod="openshift-marketplace/certified-operators-wd4q4" Mar 20 11:01:28 crc kubenswrapper[4695]: I0320 11:01:28.312294 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wd4q4" Mar 20 11:01:28 crc kubenswrapper[4695]: I0320 11:01:28.602361 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-t8f2c"] Mar 20 11:01:28 crc kubenswrapper[4695]: I0320 11:01:28.614955 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t8f2c" Mar 20 11:01:28 crc kubenswrapper[4695]: I0320 11:01:28.620953 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 11:01:28 crc kubenswrapper[4695]: I0320 11:01:28.636437 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzhfk\" (UniqueName: \"kubernetes.io/projected/1754a3b6-2691-4878-8fb1-38668a0e103a-kube-api-access-qzhfk\") pod \"community-operators-t8f2c\" (UID: \"1754a3b6-2691-4878-8fb1-38668a0e103a\") " pod="openshift-marketplace/community-operators-t8f2c" Mar 20 11:01:28 crc kubenswrapper[4695]: I0320 11:01:28.636528 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1754a3b6-2691-4878-8fb1-38668a0e103a-catalog-content\") pod \"community-operators-t8f2c\" (UID: 
\"1754a3b6-2691-4878-8fb1-38668a0e103a\") " pod="openshift-marketplace/community-operators-t8f2c"
Mar 20 11:01:28 crc kubenswrapper[4695]: I0320 11:01:28.636563 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1754a3b6-2691-4878-8fb1-38668a0e103a-utilities\") pod \"community-operators-t8f2c\" (UID: \"1754a3b6-2691-4878-8fb1-38668a0e103a\") " pod="openshift-marketplace/community-operators-t8f2c"
Mar 20 11:01:28 crc kubenswrapper[4695]: I0320 11:01:28.638573 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t8f2c"]
Mar 20 11:01:28 crc kubenswrapper[4695]: I0320 11:01:28.671399 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wd4q4"]
Mar 20 11:01:28 crc kubenswrapper[4695]: W0320 11:01:28.677965 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod022dc941_0ef6_4062_9089_3b4fd1c2e404.slice/crio-e133f508f80597a6bfd56e14873aed622350a397a927cbc2d3c40edf4d7d23e1 WatchSource:0}: Error finding container e133f508f80597a6bfd56e14873aed622350a397a927cbc2d3c40edf4d7d23e1: Status 404 returned error can't find the container with id e133f508f80597a6bfd56e14873aed622350a397a927cbc2d3c40edf4d7d23e1
Mar 20 11:01:28 crc kubenswrapper[4695]: I0320 11:01:28.735591 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wd4q4" event={"ID":"022dc941-0ef6-4062-9089-3b4fd1c2e404","Type":"ContainerStarted","Data":"e133f508f80597a6bfd56e14873aed622350a397a927cbc2d3c40edf4d7d23e1"}
Mar 20 11:01:28 crc kubenswrapper[4695]: I0320 11:01:28.737623 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzhfk\" (UniqueName: \"kubernetes.io/projected/1754a3b6-2691-4878-8fb1-38668a0e103a-kube-api-access-qzhfk\") pod \"community-operators-t8f2c\" (UID: \"1754a3b6-2691-4878-8fb1-38668a0e103a\") " pod="openshift-marketplace/community-operators-t8f2c"
Mar 20 11:01:28 crc kubenswrapper[4695]: I0320 11:01:28.737694 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1754a3b6-2691-4878-8fb1-38668a0e103a-catalog-content\") pod \"community-operators-t8f2c\" (UID: \"1754a3b6-2691-4878-8fb1-38668a0e103a\") " pod="openshift-marketplace/community-operators-t8f2c"
Mar 20 11:01:28 crc kubenswrapper[4695]: I0320 11:01:28.737717 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1754a3b6-2691-4878-8fb1-38668a0e103a-utilities\") pod \"community-operators-t8f2c\" (UID: \"1754a3b6-2691-4878-8fb1-38668a0e103a\") " pod="openshift-marketplace/community-operators-t8f2c"
Mar 20 11:01:28 crc kubenswrapper[4695]: I0320 11:01:28.738316 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1754a3b6-2691-4878-8fb1-38668a0e103a-utilities\") pod \"community-operators-t8f2c\" (UID: \"1754a3b6-2691-4878-8fb1-38668a0e103a\") " pod="openshift-marketplace/community-operators-t8f2c"
Mar 20 11:01:28 crc kubenswrapper[4695]: I0320 11:01:28.738596 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1754a3b6-2691-4878-8fb1-38668a0e103a-catalog-content\") pod \"community-operators-t8f2c\" (UID: \"1754a3b6-2691-4878-8fb1-38668a0e103a\") " pod="openshift-marketplace/community-operators-t8f2c"
Mar 20 11:01:28 crc kubenswrapper[4695]: I0320 11:01:28.757626 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzhfk\" (UniqueName: \"kubernetes.io/projected/1754a3b6-2691-4878-8fb1-38668a0e103a-kube-api-access-qzhfk\") pod \"community-operators-t8f2c\" (UID: \"1754a3b6-2691-4878-8fb1-38668a0e103a\") " pod="openshift-marketplace/community-operators-t8f2c"
Mar 20 11:01:28 crc kubenswrapper[4695]: I0320 11:01:28.942376 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t8f2c"
Mar 20 11:01:29 crc kubenswrapper[4695]: I0320 11:01:29.163012 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t8f2c"]
Mar 20 11:01:29 crc kubenswrapper[4695]: I0320 11:01:29.744951 4695 generic.go:334] "Generic (PLEG): container finished" podID="1754a3b6-2691-4878-8fb1-38668a0e103a" containerID="9d134c4f3046cc6e78911c9ef6c177e119bb563a245b897a79ba206ca733de5d" exitCode=0
Mar 20 11:01:29 crc kubenswrapper[4695]: I0320 11:01:29.745053 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t8f2c" event={"ID":"1754a3b6-2691-4878-8fb1-38668a0e103a","Type":"ContainerDied","Data":"9d134c4f3046cc6e78911c9ef6c177e119bb563a245b897a79ba206ca733de5d"}
Mar 20 11:01:29 crc kubenswrapper[4695]: I0320 11:01:29.745103 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t8f2c" event={"ID":"1754a3b6-2691-4878-8fb1-38668a0e103a","Type":"ContainerStarted","Data":"7300209331cb8844b86e6d0e64b696685454f3e578477a4223533ac8aec59c14"}
Mar 20 11:01:29 crc kubenswrapper[4695]: I0320 11:01:29.747533 4695 generic.go:334] "Generic (PLEG): container finished" podID="022dc941-0ef6-4062-9089-3b4fd1c2e404" containerID="5bcc93127897da04a0730068f078ee8398d893707c7522295655d414c5af6ea0" exitCode=0
Mar 20 11:01:29 crc kubenswrapper[4695]: I0320 11:01:29.747602 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wd4q4" event={"ID":"022dc941-0ef6-4062-9089-3b4fd1c2e404","Type":"ContainerDied","Data":"5bcc93127897da04a0730068f078ee8398d893707c7522295655d414c5af6ea0"}
Mar 20 11:01:29 crc kubenswrapper[4695]: I0320 11:01:29.751584 4695 generic.go:334] "Generic (PLEG): container finished" podID="2b4f2e1f-35d1-4755-9e68-9a57f30f9423" containerID="6318671ef877b8c79ea1e1ac23aee02b4d0d8b12c59ca42793c4325aec347c63" exitCode=0
Mar 20 11:01:29 crc kubenswrapper[4695]: I0320 11:01:29.751645 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-865c5" event={"ID":"2b4f2e1f-35d1-4755-9e68-9a57f30f9423","Type":"ContainerDied","Data":"6318671ef877b8c79ea1e1ac23aee02b4d0d8b12c59ca42793c4325aec347c63"}
Mar 20 11:01:30 crc kubenswrapper[4695]: I0320 11:01:30.397509 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-slr2w"]
Mar 20 11:01:30 crc kubenswrapper[4695]: I0320 11:01:30.400508 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-slr2w"
Mar 20 11:01:30 crc kubenswrapper[4695]: I0320 11:01:30.404667 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 20 11:01:30 crc kubenswrapper[4695]: I0320 11:01:30.405413 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-slr2w"]
Mar 20 11:01:30 crc kubenswrapper[4695]: I0320 11:01:30.579727 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vjhg\" (UniqueName: \"kubernetes.io/projected/c0a73066-5eca-4060-b323-4f8bf64b4a6c-kube-api-access-5vjhg\") pod \"redhat-marketplace-slr2w\" (UID: \"c0a73066-5eca-4060-b323-4f8bf64b4a6c\") " pod="openshift-marketplace/redhat-marketplace-slr2w"
Mar 20 11:01:30 crc kubenswrapper[4695]: I0320 11:01:30.579807 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0a73066-5eca-4060-b323-4f8bf64b4a6c-utilities\") pod \"redhat-marketplace-slr2w\" (UID: \"c0a73066-5eca-4060-b323-4f8bf64b4a6c\") " pod="openshift-marketplace/redhat-marketplace-slr2w"
Mar 20 11:01:30 crc kubenswrapper[4695]: I0320 11:01:30.579876 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0a73066-5eca-4060-b323-4f8bf64b4a6c-catalog-content\") pod \"redhat-marketplace-slr2w\" (UID: \"c0a73066-5eca-4060-b323-4f8bf64b4a6c\") " pod="openshift-marketplace/redhat-marketplace-slr2w"
Mar 20 11:01:30 crc kubenswrapper[4695]: I0320 11:01:30.692932 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0a73066-5eca-4060-b323-4f8bf64b4a6c-utilities\") pod \"redhat-marketplace-slr2w\" (UID: \"c0a73066-5eca-4060-b323-4f8bf64b4a6c\") " pod="openshift-marketplace/redhat-marketplace-slr2w"
Mar 20 11:01:30 crc kubenswrapper[4695]: I0320 11:01:30.693018 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0a73066-5eca-4060-b323-4f8bf64b4a6c-catalog-content\") pod \"redhat-marketplace-slr2w\" (UID: \"c0a73066-5eca-4060-b323-4f8bf64b4a6c\") " pod="openshift-marketplace/redhat-marketplace-slr2w"
Mar 20 11:01:30 crc kubenswrapper[4695]: I0320 11:01:30.693090 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vjhg\" (UniqueName: \"kubernetes.io/projected/c0a73066-5eca-4060-b323-4f8bf64b4a6c-kube-api-access-5vjhg\") pod \"redhat-marketplace-slr2w\" (UID: \"c0a73066-5eca-4060-b323-4f8bf64b4a6c\") " pod="openshift-marketplace/redhat-marketplace-slr2w"
Mar 20 11:01:30 crc kubenswrapper[4695]: I0320 11:01:30.693992 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0a73066-5eca-4060-b323-4f8bf64b4a6c-utilities\") pod \"redhat-marketplace-slr2w\" (UID: \"c0a73066-5eca-4060-b323-4f8bf64b4a6c\") " pod="openshift-marketplace/redhat-marketplace-slr2w"
Mar 20 11:01:30 crc kubenswrapper[4695]: I0320 11:01:30.694068 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0a73066-5eca-4060-b323-4f8bf64b4a6c-catalog-content\") pod \"redhat-marketplace-slr2w\" (UID: \"c0a73066-5eca-4060-b323-4f8bf64b4a6c\") " pod="openshift-marketplace/redhat-marketplace-slr2w"
Mar 20 11:01:30 crc kubenswrapper[4695]: I0320 11:01:30.716985 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vjhg\" (UniqueName: \"kubernetes.io/projected/c0a73066-5eca-4060-b323-4f8bf64b4a6c-kube-api-access-5vjhg\") pod \"redhat-marketplace-slr2w\" (UID: \"c0a73066-5eca-4060-b323-4f8bf64b4a6c\") " pod="openshift-marketplace/redhat-marketplace-slr2w"
Mar 20 11:01:30 crc kubenswrapper[4695]: I0320 11:01:30.760866 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wd4q4" event={"ID":"022dc941-0ef6-4062-9089-3b4fd1c2e404","Type":"ContainerStarted","Data":"011e86f207034a0a0cebbac61cebf6d9491236125c3080056827dabbe0b60605"}
Mar 20 11:01:30 crc kubenswrapper[4695]: I0320 11:01:30.765303 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-slr2w"
Mar 20 11:01:30 crc kubenswrapper[4695]: I0320 11:01:30.769824 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-865c5" event={"ID":"2b4f2e1f-35d1-4755-9e68-9a57f30f9423","Type":"ContainerStarted","Data":"09195cc750e5b3241f560fd4476d6862620674e6ca6892a16caaeae2fee4b36a"}
Mar 20 11:01:30 crc kubenswrapper[4695]: I0320 11:01:30.823781 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-865c5" podStartSLOduration=2.087632168 podStartE2EDuration="4.823718527s" podCreationTimestamp="2026-03-20 11:01:26 +0000 UTC" firstStartedPulling="2026-03-20 11:01:27.729506343 +0000 UTC m=+465.510111906" lastFinishedPulling="2026-03-20 11:01:30.465592702 +0000 UTC m=+468.246198265" observedRunningTime="2026-03-20 11:01:30.815713473 +0000 UTC m=+468.596319036" watchObservedRunningTime="2026-03-20 11:01:30.823718527 +0000 UTC m=+468.604324090"
Mar 20 11:01:31 crc kubenswrapper[4695]: I0320 11:01:31.056485 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-slr2w"]
Mar 20 11:01:31 crc kubenswrapper[4695]: W0320 11:01:31.063794 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0a73066_5eca_4060_b323_4f8bf64b4a6c.slice/crio-2b8b9ee16e159c101ef273952c7fffdf121a075bad5ea4e78fdc2a44deb58552 WatchSource:0}: Error finding container 2b8b9ee16e159c101ef273952c7fffdf121a075bad5ea4e78fdc2a44deb58552: Status 404 returned error can't find the container with id 2b8b9ee16e159c101ef273952c7fffdf121a075bad5ea4e78fdc2a44deb58552
Mar 20 11:01:31 crc kubenswrapper[4695]: I0320 11:01:31.778730 4695 generic.go:334] "Generic (PLEG): container finished" podID="022dc941-0ef6-4062-9089-3b4fd1c2e404" containerID="011e86f207034a0a0cebbac61cebf6d9491236125c3080056827dabbe0b60605" exitCode=0
Mar 20 11:01:31 crc kubenswrapper[4695]: I0320 11:01:31.778796 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wd4q4" event={"ID":"022dc941-0ef6-4062-9089-3b4fd1c2e404","Type":"ContainerDied","Data":"011e86f207034a0a0cebbac61cebf6d9491236125c3080056827dabbe0b60605"}
Mar 20 11:01:31 crc kubenswrapper[4695]: I0320 11:01:31.783032 4695 generic.go:334] "Generic (PLEG): container finished" podID="1754a3b6-2691-4878-8fb1-38668a0e103a" containerID="a8ca521b1d9757ddde3e9799f585f58b7526afec4830d0916765da2ac5b21ca7" exitCode=0
Mar 20 11:01:31 crc kubenswrapper[4695]: I0320 11:01:31.783104 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t8f2c" event={"ID":"1754a3b6-2691-4878-8fb1-38668a0e103a","Type":"ContainerDied","Data":"a8ca521b1d9757ddde3e9799f585f58b7526afec4830d0916765da2ac5b21ca7"}
Mar 20 11:01:31 crc kubenswrapper[4695]: I0320 11:01:31.785410 4695 generic.go:334] "Generic (PLEG): container finished" podID="c0a73066-5eca-4060-b323-4f8bf64b4a6c" containerID="70bc9df7ad0dba4dd10bff361d1f714b218c3eccbd4b4367c5b9284b1dc8170e" exitCode=0
Mar 20 11:01:31 crc kubenswrapper[4695]: I0320 11:01:31.785882 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-slr2w" event={"ID":"c0a73066-5eca-4060-b323-4f8bf64b4a6c","Type":"ContainerDied","Data":"70bc9df7ad0dba4dd10bff361d1f714b218c3eccbd4b4367c5b9284b1dc8170e"}
Mar 20 11:01:31 crc kubenswrapper[4695]: I0320 11:01:31.785959 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-slr2w" event={"ID":"c0a73066-5eca-4060-b323-4f8bf64b4a6c","Type":"ContainerStarted","Data":"2b8b9ee16e159c101ef273952c7fffdf121a075bad5ea4e78fdc2a44deb58552"}
Mar 20 11:01:32 crc kubenswrapper[4695]: I0320 11:01:32.795297 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-slr2w" event={"ID":"c0a73066-5eca-4060-b323-4f8bf64b4a6c","Type":"ContainerStarted","Data":"34559ab65c3702881aca670f5db074315811f5a208aa9463943adcc457cf51a2"}
Mar 20 11:01:32 crc kubenswrapper[4695]: I0320 11:01:32.799436 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wd4q4" event={"ID":"022dc941-0ef6-4062-9089-3b4fd1c2e404","Type":"ContainerStarted","Data":"136772251dbe1e2635b4b18874e981b6215f5a25617c33a937c81a63313244a8"}
Mar 20 11:01:32 crc kubenswrapper[4695]: I0320 11:01:32.804146 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t8f2c" event={"ID":"1754a3b6-2691-4878-8fb1-38668a0e103a","Type":"ContainerStarted","Data":"e3de22dc5ffebea1e34435267b0774364383271b23fbb6e022ba41890e984fcc"}
Mar 20 11:01:32 crc kubenswrapper[4695]: I0320 11:01:32.868777 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wd4q4" podStartSLOduration=3.365412886 podStartE2EDuration="5.868753277s" podCreationTimestamp="2026-03-20 11:01:27 +0000 UTC" firstStartedPulling="2026-03-20 11:01:29.748971714 +0000 UTC m=+467.529577277" lastFinishedPulling="2026-03-20 11:01:32.252312105 +0000 UTC m=+470.032917668" observedRunningTime="2026-03-20 11:01:32.845248019 +0000 UTC m=+470.625853582" watchObservedRunningTime="2026-03-20 11:01:32.868753277 +0000 UTC m=+470.649358840"
Mar 20 11:01:32 crc kubenswrapper[4695]: I0320 11:01:32.871746 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-t8f2c" podStartSLOduration=2.390473922 podStartE2EDuration="4.871730382s" podCreationTimestamp="2026-03-20 11:01:28 +0000 UTC" firstStartedPulling="2026-03-20 11:01:29.747926587 +0000 UTC m=+467.528532150" lastFinishedPulling="2026-03-20 11:01:32.229183047 +0000 UTC m=+470.009788610" observedRunningTime="2026-03-20 11:01:32.865942665 +0000 UTC m=+470.646548228" watchObservedRunningTime="2026-03-20 11:01:32.871730382 +0000 UTC m=+470.652335945"
Mar 20 11:01:33 crc kubenswrapper[4695]: I0320 11:01:33.830426 4695 generic.go:334] "Generic (PLEG): container finished" podID="c0a73066-5eca-4060-b323-4f8bf64b4a6c" containerID="34559ab65c3702881aca670f5db074315811f5a208aa9463943adcc457cf51a2" exitCode=0
Mar 20 11:01:33 crc kubenswrapper[4695]: I0320 11:01:33.830531 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-slr2w" event={"ID":"c0a73066-5eca-4060-b323-4f8bf64b4a6c","Type":"ContainerDied","Data":"34559ab65c3702881aca670f5db074315811f5a208aa9463943adcc457cf51a2"}
Mar 20 11:01:34 crc kubenswrapper[4695]: I0320 11:01:34.840661 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-slr2w" event={"ID":"c0a73066-5eca-4060-b323-4f8bf64b4a6c","Type":"ContainerStarted","Data":"427c4714d89d390b4913efa20309cc239220abcda1c461b6a559055cb0f9ffc1"}
Mar 20 11:01:34 crc kubenswrapper[4695]: I0320 11:01:34.876388 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-slr2w" podStartSLOduration=2.364705251 podStartE2EDuration="4.876358354s" podCreationTimestamp="2026-03-20 11:01:30 +0000 UTC" firstStartedPulling="2026-03-20 11:01:31.787501208 +0000 UTC m=+469.568106811" lastFinishedPulling="2026-03-20 11:01:34.299154361 +0000 UTC m=+472.079759914" observedRunningTime="2026-03-20 11:01:34.872704382 +0000 UTC m=+472.653309955" watchObservedRunningTime="2026-03-20 11:01:34.876358354 +0000 UTC m=+472.656963917"
Mar 20 11:01:36 crc kubenswrapper[4695]: I0320 11:01:36.512303 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-865c5"
Mar 20 11:01:36 crc kubenswrapper[4695]: I0320 11:01:36.512854 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-865c5"
Mar 20 11:01:37 crc kubenswrapper[4695]: I0320 11:01:37.557604 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-865c5" podUID="2b4f2e1f-35d1-4755-9e68-9a57f30f9423" containerName="registry-server" probeResult="failure" output=<
Mar 20 11:01:37 crc kubenswrapper[4695]: timeout: failed to connect service ":50051" within 1s
Mar 20 11:01:37 crc kubenswrapper[4695]: >
Mar 20 11:01:38 crc kubenswrapper[4695]: I0320 11:01:38.204234 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 11:01:38 crc kubenswrapper[4695]: I0320 11:01:38.312460 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wd4q4"
Mar 20 11:01:38 crc kubenswrapper[4695]: I0320 11:01:38.313162 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wd4q4"
Mar 20 11:01:38 crc kubenswrapper[4695]: I0320 11:01:38.370297 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wd4q4"
Mar 20 11:01:38 crc kubenswrapper[4695]: I0320 11:01:38.430892 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 11:01:38 crc kubenswrapper[4695]: I0320 11:01:38.431384 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 11:01:38 crc kubenswrapper[4695]: I0320 11:01:38.431840 4695 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5"
Mar 20 11:01:38 crc kubenswrapper[4695]: I0320 11:01:38.433239 4695 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ec82d0faf7e4964b116a15802965ff707a29f15669e56ffaba0e38d32bd99a78"} pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 11:01:38 crc kubenswrapper[4695]: I0320 11:01:38.433338 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" containerID="cri-o://ec82d0faf7e4964b116a15802965ff707a29f15669e56ffaba0e38d32bd99a78" gracePeriod=600
Mar 20 11:01:38 crc kubenswrapper[4695]: I0320 11:01:38.867856 4695 generic.go:334] "Generic (PLEG): container finished" podID="7859c924-84d7-4855-901e-c77a02c56e3a" containerID="ec82d0faf7e4964b116a15802965ff707a29f15669e56ffaba0e38d32bd99a78" exitCode=0
Mar 20 11:01:38 crc kubenswrapper[4695]: I0320 11:01:38.867963 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" event={"ID":"7859c924-84d7-4855-901e-c77a02c56e3a","Type":"ContainerDied","Data":"ec82d0faf7e4964b116a15802965ff707a29f15669e56ffaba0e38d32bd99a78"}
Mar 20 11:01:38 crc kubenswrapper[4695]: I0320 11:01:38.868033 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" event={"ID":"7859c924-84d7-4855-901e-c77a02c56e3a","Type":"ContainerStarted","Data":"69869f4a2adf9f245570146a3380a9dd7993c55d9e0096095daece76387fad83"}
Mar 20 11:01:38 crc kubenswrapper[4695]: I0320 11:01:38.868060 4695 scope.go:117] "RemoveContainer" containerID="f52c3cc7c395c498c816cd540172b9c782623535c14aff204ea0efa08008cef3"
Mar 20 11:01:38 crc kubenswrapper[4695]: I0320 11:01:38.928145 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wd4q4"
Mar 20 11:01:38 crc kubenswrapper[4695]: I0320 11:01:38.944096 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-t8f2c"
Mar 20 11:01:38 crc kubenswrapper[4695]: I0320 11:01:38.944835 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-t8f2c"
Mar 20 11:01:39 crc kubenswrapper[4695]: I0320 11:01:39.005710 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-t8f2c"
Mar 20 11:01:39 crc kubenswrapper[4695]: I0320 11:01:39.930490 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-t8f2c"
Mar 20 11:01:40 crc kubenswrapper[4695]: I0320 11:01:40.766832 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-slr2w"
Mar 20 11:01:40 crc kubenswrapper[4695]: I0320 11:01:40.766931 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-slr2w"
Mar 20 11:01:40 crc kubenswrapper[4695]: I0320 11:01:40.816785 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-slr2w"
Mar 20 11:01:40 crc kubenswrapper[4695]: I0320 11:01:40.935571 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-slr2w"
Mar 20 11:01:46 crc kubenswrapper[4695]: I0320 11:01:46.564983 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-865c5"
Mar 20 11:01:46 crc kubenswrapper[4695]: I0320 11:01:46.607029 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-865c5"
Mar 20 11:01:47 crc kubenswrapper[4695]: I0320 11:01:47.551932 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" podUID="58b17661-5628-4e68-aa9e-9d6e850b6dbe" containerName="registry" containerID="cri-o://2209a6f1b0614ea5a119dcb626ef7ee51ef1f1b8dfc4210749065afcfe358e6c" gracePeriod=30
Mar 20 11:01:47 crc kubenswrapper[4695]: I0320 11:01:47.953065 4695 generic.go:334] "Generic (PLEG): container finished" podID="58b17661-5628-4e68-aa9e-9d6e850b6dbe" containerID="2209a6f1b0614ea5a119dcb626ef7ee51ef1f1b8dfc4210749065afcfe358e6c" exitCode=0
Mar 20 11:01:47 crc kubenswrapper[4695]: I0320 11:01:47.953164 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" event={"ID":"58b17661-5628-4e68-aa9e-9d6e850b6dbe","Type":"ContainerDied","Data":"2209a6f1b0614ea5a119dcb626ef7ee51ef1f1b8dfc4210749065afcfe358e6c"}
Mar 20 11:01:47 crc kubenswrapper[4695]: I0320 11:01:47.953468 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk" event={"ID":"58b17661-5628-4e68-aa9e-9d6e850b6dbe","Type":"ContainerDied","Data":"1e918bf12b87f426964b59fc38aacd1dcc97b3e1020893f334bff29cf12fdd38"}
Mar 20 11:01:47 crc kubenswrapper[4695]: I0320 11:01:47.953483 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e918bf12b87f426964b59fc38aacd1dcc97b3e1020893f334bff29cf12fdd38"
Mar 20 11:01:47 crc kubenswrapper[4695]: I0320 11:01:47.972089 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk"
Mar 20 11:01:48 crc kubenswrapper[4695]: I0320 11:01:48.076681 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/58b17661-5628-4e68-aa9e-9d6e850b6dbe-ca-trust-extracted\") pod \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") "
Mar 20 11:01:48 crc kubenswrapper[4695]: I0320 11:01:48.077204 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/58b17661-5628-4e68-aa9e-9d6e850b6dbe-registry-certificates\") pod \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") "
Mar 20 11:01:48 crc kubenswrapper[4695]: I0320 11:01:48.077252 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/58b17661-5628-4e68-aa9e-9d6e850b6dbe-installation-pull-secrets\") pod \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") "
Mar 20 11:01:48 crc kubenswrapper[4695]: I0320 11:01:48.077284 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8ndf\" (UniqueName: \"kubernetes.io/projected/58b17661-5628-4e68-aa9e-9d6e850b6dbe-kube-api-access-r8ndf\") pod \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") "
Mar 20 11:01:48 crc kubenswrapper[4695]: I0320 11:01:48.077484 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") "
Mar 20 11:01:48 crc kubenswrapper[4695]: I0320 11:01:48.077611 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58b17661-5628-4e68-aa9e-9d6e850b6dbe-trusted-ca\") pod \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") "
Mar 20 11:01:48 crc kubenswrapper[4695]: I0320 11:01:48.077637 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/58b17661-5628-4e68-aa9e-9d6e850b6dbe-registry-tls\") pod \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") "
Mar 20 11:01:48 crc kubenswrapper[4695]: I0320 11:01:48.077707 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/58b17661-5628-4e68-aa9e-9d6e850b6dbe-bound-sa-token\") pod \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\" (UID: \"58b17661-5628-4e68-aa9e-9d6e850b6dbe\") "
Mar 20 11:01:48 crc kubenswrapper[4695]: I0320 11:01:48.079427 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58b17661-5628-4e68-aa9e-9d6e850b6dbe-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "58b17661-5628-4e68-aa9e-9d6e850b6dbe" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 11:01:48 crc kubenswrapper[4695]: I0320 11:01:48.080587 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58b17661-5628-4e68-aa9e-9d6e850b6dbe-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "58b17661-5628-4e68-aa9e-9d6e850b6dbe" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 11:01:48 crc kubenswrapper[4695]: I0320 11:01:48.092820 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58b17661-5628-4e68-aa9e-9d6e850b6dbe-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "58b17661-5628-4e68-aa9e-9d6e850b6dbe" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:01:48 crc kubenswrapper[4695]: I0320 11:01:48.093112 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58b17661-5628-4e68-aa9e-9d6e850b6dbe-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "58b17661-5628-4e68-aa9e-9d6e850b6dbe" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 11:01:48 crc kubenswrapper[4695]: I0320 11:01:48.093201 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58b17661-5628-4e68-aa9e-9d6e850b6dbe-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "58b17661-5628-4e68-aa9e-9d6e850b6dbe" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:01:48 crc kubenswrapper[4695]: I0320 11:01:48.093747 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58b17661-5628-4e68-aa9e-9d6e850b6dbe-kube-api-access-r8ndf" (OuterVolumeSpecName: "kube-api-access-r8ndf") pod "58b17661-5628-4e68-aa9e-9d6e850b6dbe" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe"). InnerVolumeSpecName "kube-api-access-r8ndf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:01:48 crc kubenswrapper[4695]: I0320 11:01:48.096796 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58b17661-5628-4e68-aa9e-9d6e850b6dbe-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "58b17661-5628-4e68-aa9e-9d6e850b6dbe" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:01:48 crc kubenswrapper[4695]: I0320 11:01:48.102784 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "58b17661-5628-4e68-aa9e-9d6e850b6dbe" (UID: "58b17661-5628-4e68-aa9e-9d6e850b6dbe"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 20 11:01:48 crc kubenswrapper[4695]: I0320 11:01:48.179433 4695 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/58b17661-5628-4e68-aa9e-9d6e850b6dbe-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 20 11:01:48 crc kubenswrapper[4695]: I0320 11:01:48.179471 4695 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/58b17661-5628-4e68-aa9e-9d6e850b6dbe-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 20 11:01:48 crc kubenswrapper[4695]: I0320 11:01:48.179481 4695 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/58b17661-5628-4e68-aa9e-9d6e850b6dbe-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 20 11:01:48 crc kubenswrapper[4695]: I0320 11:01:48.179494 4695 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/58b17661-5628-4e68-aa9e-9d6e850b6dbe-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 20 11:01:48 crc kubenswrapper[4695]: I0320 11:01:48.179503 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8ndf\" (UniqueName: \"kubernetes.io/projected/58b17661-5628-4e68-aa9e-9d6e850b6dbe-kube-api-access-r8ndf\") on node \"crc\" DevicePath \"\""
Mar 20 11:01:48 crc kubenswrapper[4695]: I0320 11:01:48.179513 4695 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/58b17661-5628-4e68-aa9e-9d6e850b6dbe-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 20 11:01:48 crc kubenswrapper[4695]: I0320 11:01:48.179522 4695 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/58b17661-5628-4e68-aa9e-9d6e850b6dbe-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 20 11:01:48 crc kubenswrapper[4695]: I0320 11:01:48.970804 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-h8rbk"
Mar 20 11:01:48 crc kubenswrapper[4695]: I0320 11:01:48.997424 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-h8rbk"]
Mar 20 11:01:49 crc kubenswrapper[4695]: I0320 11:01:49.003067 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-h8rbk"]
Mar 20 11:01:50 crc kubenswrapper[4695]: I0320 11:01:50.894694 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58b17661-5628-4e68-aa9e-9d6e850b6dbe" path="/var/lib/kubelet/pods/58b17661-5628-4e68-aa9e-9d6e850b6dbe/volumes"
Mar 20 11:02:00 crc kubenswrapper[4695]: I0320 11:02:00.146293 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566742-m6jqc"]
Mar 20 11:02:00 crc kubenswrapper[4695]: E0320 11:02:00.147332 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58b17661-5628-4e68-aa9e-9d6e850b6dbe" containerName="registry"
Mar 20 11:02:00 crc kubenswrapper[4695]: I0320 11:02:00.147352 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="58b17661-5628-4e68-aa9e-9d6e850b6dbe" containerName="registry"
Mar 20 11:02:00 crc kubenswrapper[4695]: I0320 11:02:00.147472 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="58b17661-5628-4e68-aa9e-9d6e850b6dbe" containerName="registry"
Mar 20 11:02:00 crc kubenswrapper[4695]: I0320 11:02:00.147996 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566742-m6jqc"
Mar 20 11:02:00 crc kubenswrapper[4695]: I0320 11:02:00.151959 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566742-m6jqc"]
Mar 20 11:02:00 crc kubenswrapper[4695]: I0320 11:02:00.153556 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 11:02:00 crc kubenswrapper[4695]: I0320 11:02:00.153955 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5kqds"
Mar 20 11:02:00 crc kubenswrapper[4695]: I0320 11:02:00.154003 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 11:02:00 crc kubenswrapper[4695]: I0320 11:02:00.245670 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4svx\" (UniqueName: \"kubernetes.io/projected/b7961256-fd7f-48a0-ab69-bc6c6bb0f37a-kube-api-access-g4svx\") pod \"auto-csr-approver-29566742-m6jqc\" (UID: \"b7961256-fd7f-48a0-ab69-bc6c6bb0f37a\") " pod="openshift-infra/auto-csr-approver-29566742-m6jqc"
Mar 20 11:02:00 crc kubenswrapper[4695]: I0320 11:02:00.347177 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4svx\" (UniqueName: \"kubernetes.io/projected/b7961256-fd7f-48a0-ab69-bc6c6bb0f37a-kube-api-access-g4svx\") pod \"auto-csr-approver-29566742-m6jqc\" (UID: \"b7961256-fd7f-48a0-ab69-bc6c6bb0f37a\") " pod="openshift-infra/auto-csr-approver-29566742-m6jqc"
Mar 20 11:02:00 crc kubenswrapper[4695]: I0320 11:02:00.369218 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4svx\" (UniqueName: \"kubernetes.io/projected/b7961256-fd7f-48a0-ab69-bc6c6bb0f37a-kube-api-access-g4svx\") pod \"auto-csr-approver-29566742-m6jqc\" (UID: \"b7961256-fd7f-48a0-ab69-bc6c6bb0f37a\") " pod="openshift-infra/auto-csr-approver-29566742-m6jqc"
Mar 20 11:02:00 crc kubenswrapper[4695]: I0320 11:02:00.477064 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566742-m6jqc"
Mar 20 11:02:00 crc kubenswrapper[4695]: I0320 11:02:00.679225 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566742-m6jqc"]
Mar 20 11:02:01 crc kubenswrapper[4695]: I0320 11:02:01.064105 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566742-m6jqc" event={"ID":"b7961256-fd7f-48a0-ab69-bc6c6bb0f37a","Type":"ContainerStarted","Data":"adc5bf66708f387b07ec054b16abcc7333501d52ac3f0397e5bf082c81c1f6d4"}
Mar 20 11:02:02 crc kubenswrapper[4695]: I0320 11:02:02.073060 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566742-m6jqc" event={"ID":"b7961256-fd7f-48a0-ab69-bc6c6bb0f37a","Type":"ContainerStarted","Data":"6f14d203e0a09c15d5b46a0a38c7e1ae507907bfda3952dede2c47d6c81cc366"}
Mar 20 11:02:02 crc kubenswrapper[4695]: I0320 11:02:02.093059 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566742-m6jqc" podStartSLOduration=0.959966647 podStartE2EDuration="2.093029383s" podCreationTimestamp="2026-03-20 11:02:00 +0000 UTC" firstStartedPulling="2026-03-20 11:02:00.684639458 +0000 UTC m=+498.465245021" lastFinishedPulling="2026-03-20 11:02:01.817702194 +0000 UTC m=+499.598307757" observedRunningTime="2026-03-20 11:02:02.088824946 +0000 UTC m=+499.869430519" watchObservedRunningTime="2026-03-20 11:02:02.093029383 +0000 UTC m=+499.873634946"
Mar 20 11:02:03 crc kubenswrapper[4695]: I0320 11:02:03.081963 4695 generic.go:334] "Generic (PLEG): container finished" podID="b7961256-fd7f-48a0-ab69-bc6c6bb0f37a" containerID="6f14d203e0a09c15d5b46a0a38c7e1ae507907bfda3952dede2c47d6c81cc366" exitCode=0
Mar 20 11:02:03 crc
kubenswrapper[4695]: I0320 11:02:03.082024 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566742-m6jqc" event={"ID":"b7961256-fd7f-48a0-ab69-bc6c6bb0f37a","Type":"ContainerDied","Data":"6f14d203e0a09c15d5b46a0a38c7e1ae507907bfda3952dede2c47d6c81cc366"} Mar 20 11:02:04 crc kubenswrapper[4695]: I0320 11:02:04.312922 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566742-m6jqc" Mar 20 11:02:04 crc kubenswrapper[4695]: I0320 11:02:04.404501 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4svx\" (UniqueName: \"kubernetes.io/projected/b7961256-fd7f-48a0-ab69-bc6c6bb0f37a-kube-api-access-g4svx\") pod \"b7961256-fd7f-48a0-ab69-bc6c6bb0f37a\" (UID: \"b7961256-fd7f-48a0-ab69-bc6c6bb0f37a\") " Mar 20 11:02:04 crc kubenswrapper[4695]: I0320 11:02:04.411720 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7961256-fd7f-48a0-ab69-bc6c6bb0f37a-kube-api-access-g4svx" (OuterVolumeSpecName: "kube-api-access-g4svx") pod "b7961256-fd7f-48a0-ab69-bc6c6bb0f37a" (UID: "b7961256-fd7f-48a0-ab69-bc6c6bb0f37a"). InnerVolumeSpecName "kube-api-access-g4svx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:02:04 crc kubenswrapper[4695]: I0320 11:02:04.505877 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4svx\" (UniqueName: \"kubernetes.io/projected/b7961256-fd7f-48a0-ab69-bc6c6bb0f37a-kube-api-access-g4svx\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:05 crc kubenswrapper[4695]: I0320 11:02:05.098244 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566742-m6jqc" event={"ID":"b7961256-fd7f-48a0-ab69-bc6c6bb0f37a","Type":"ContainerDied","Data":"adc5bf66708f387b07ec054b16abcc7333501d52ac3f0397e5bf082c81c1f6d4"} Mar 20 11:02:05 crc kubenswrapper[4695]: I0320 11:02:05.098296 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adc5bf66708f387b07ec054b16abcc7333501d52ac3f0397e5bf082c81c1f6d4" Mar 20 11:02:05 crc kubenswrapper[4695]: I0320 11:02:05.098331 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566742-m6jqc" Mar 20 11:02:05 crc kubenswrapper[4695]: I0320 11:02:05.151886 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566736-t45k7"] Mar 20 11:02:05 crc kubenswrapper[4695]: I0320 11:02:05.155376 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566736-t45k7"] Mar 20 11:02:06 crc kubenswrapper[4695]: I0320 11:02:06.895765 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1ec0c6c-21bf-4b5f-b973-fd68c0d1c0f5" path="/var/lib/kubelet/pods/b1ec0c6c-21bf-4b5f-b973-fd68c0d1c0f5/volumes" Mar 20 11:03:38 crc kubenswrapper[4695]: I0320 11:03:38.430559 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 20 11:03:38 crc kubenswrapper[4695]: I0320 11:03:38.431424 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:04:00 crc kubenswrapper[4695]: I0320 11:04:00.139570 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566744-wbhsz"] Mar 20 11:04:00 crc kubenswrapper[4695]: E0320 11:04:00.140830 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7961256-fd7f-48a0-ab69-bc6c6bb0f37a" containerName="oc" Mar 20 11:04:00 crc kubenswrapper[4695]: I0320 11:04:00.140851 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7961256-fd7f-48a0-ab69-bc6c6bb0f37a" containerName="oc" Mar 20 11:04:00 crc kubenswrapper[4695]: I0320 11:04:00.141060 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7961256-fd7f-48a0-ab69-bc6c6bb0f37a" containerName="oc" Mar 20 11:04:00 crc kubenswrapper[4695]: I0320 11:04:00.141719 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566744-wbhsz" Mar 20 11:04:00 crc kubenswrapper[4695]: I0320 11:04:00.145053 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:04:00 crc kubenswrapper[4695]: I0320 11:04:00.145049 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:04:00 crc kubenswrapper[4695]: I0320 11:04:00.147398 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5kqds" Mar 20 11:04:00 crc kubenswrapper[4695]: I0320 11:04:00.150757 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566744-wbhsz"] Mar 20 11:04:00 crc kubenswrapper[4695]: I0320 11:04:00.235548 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzzgg\" (UniqueName: \"kubernetes.io/projected/2e0d246a-dbac-4f18-8f90-2f8c7a4fcc96-kube-api-access-vzzgg\") pod \"auto-csr-approver-29566744-wbhsz\" (UID: \"2e0d246a-dbac-4f18-8f90-2f8c7a4fcc96\") " pod="openshift-infra/auto-csr-approver-29566744-wbhsz" Mar 20 11:04:00 crc kubenswrapper[4695]: I0320 11:04:00.336667 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzzgg\" (UniqueName: \"kubernetes.io/projected/2e0d246a-dbac-4f18-8f90-2f8c7a4fcc96-kube-api-access-vzzgg\") pod \"auto-csr-approver-29566744-wbhsz\" (UID: \"2e0d246a-dbac-4f18-8f90-2f8c7a4fcc96\") " pod="openshift-infra/auto-csr-approver-29566744-wbhsz" Mar 20 11:04:00 crc kubenswrapper[4695]: I0320 11:04:00.359985 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzzgg\" (UniqueName: \"kubernetes.io/projected/2e0d246a-dbac-4f18-8f90-2f8c7a4fcc96-kube-api-access-vzzgg\") pod \"auto-csr-approver-29566744-wbhsz\" (UID: \"2e0d246a-dbac-4f18-8f90-2f8c7a4fcc96\") " 
pod="openshift-infra/auto-csr-approver-29566744-wbhsz" Mar 20 11:04:00 crc kubenswrapper[4695]: I0320 11:04:00.461339 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566744-wbhsz" Mar 20 11:04:00 crc kubenswrapper[4695]: I0320 11:04:00.873674 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566744-wbhsz"] Mar 20 11:04:00 crc kubenswrapper[4695]: I0320 11:04:00.884419 4695 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:04:01 crc kubenswrapper[4695]: I0320 11:04:01.836205 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566744-wbhsz" event={"ID":"2e0d246a-dbac-4f18-8f90-2f8c7a4fcc96","Type":"ContainerStarted","Data":"a79aeaab758fc7f63a83b350942b5df3c19d02517b17f33a001c11e3bab19230"} Mar 20 11:04:02 crc kubenswrapper[4695]: I0320 11:04:02.845357 4695 generic.go:334] "Generic (PLEG): container finished" podID="2e0d246a-dbac-4f18-8f90-2f8c7a4fcc96" containerID="a8038cd44836d910d0a4138dedd63290d21948cd96d3228b49de0b271f34177b" exitCode=0 Mar 20 11:04:02 crc kubenswrapper[4695]: I0320 11:04:02.845485 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566744-wbhsz" event={"ID":"2e0d246a-dbac-4f18-8f90-2f8c7a4fcc96","Type":"ContainerDied","Data":"a8038cd44836d910d0a4138dedd63290d21948cd96d3228b49de0b271f34177b"} Mar 20 11:04:04 crc kubenswrapper[4695]: I0320 11:04:04.079635 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566744-wbhsz" Mar 20 11:04:04 crc kubenswrapper[4695]: I0320 11:04:04.192001 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzzgg\" (UniqueName: \"kubernetes.io/projected/2e0d246a-dbac-4f18-8f90-2f8c7a4fcc96-kube-api-access-vzzgg\") pod \"2e0d246a-dbac-4f18-8f90-2f8c7a4fcc96\" (UID: \"2e0d246a-dbac-4f18-8f90-2f8c7a4fcc96\") " Mar 20 11:04:04 crc kubenswrapper[4695]: I0320 11:04:04.199049 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e0d246a-dbac-4f18-8f90-2f8c7a4fcc96-kube-api-access-vzzgg" (OuterVolumeSpecName: "kube-api-access-vzzgg") pod "2e0d246a-dbac-4f18-8f90-2f8c7a4fcc96" (UID: "2e0d246a-dbac-4f18-8f90-2f8c7a4fcc96"). InnerVolumeSpecName "kube-api-access-vzzgg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:04:04 crc kubenswrapper[4695]: I0320 11:04:04.293761 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzzgg\" (UniqueName: \"kubernetes.io/projected/2e0d246a-dbac-4f18-8f90-2f8c7a4fcc96-kube-api-access-vzzgg\") on node \"crc\" DevicePath \"\"" Mar 20 11:04:04 crc kubenswrapper[4695]: I0320 11:04:04.827029 4695 scope.go:117] "RemoveContainer" containerID="c5b6b458e709a1e714860e623ac759f65ec98325a44534a62d0a4dac8d0b9b88" Mar 20 11:04:04 crc kubenswrapper[4695]: I0320 11:04:04.851581 4695 scope.go:117] "RemoveContainer" containerID="2209a6f1b0614ea5a119dcb626ef7ee51ef1f1b8dfc4210749065afcfe358e6c" Mar 20 11:04:04 crc kubenswrapper[4695]: I0320 11:04:04.864652 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566744-wbhsz" event={"ID":"2e0d246a-dbac-4f18-8f90-2f8c7a4fcc96","Type":"ContainerDied","Data":"a79aeaab758fc7f63a83b350942b5df3c19d02517b17f33a001c11e3bab19230"} Mar 20 11:04:04 crc kubenswrapper[4695]: I0320 11:04:04.864706 4695 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="a79aeaab758fc7f63a83b350942b5df3c19d02517b17f33a001c11e3bab19230" Mar 20 11:04:04 crc kubenswrapper[4695]: I0320 11:04:04.864730 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566744-wbhsz" Mar 20 11:04:04 crc kubenswrapper[4695]: I0320 11:04:04.870694 4695 scope.go:117] "RemoveContainer" containerID="a247a8936badba2d41e89bf70a0fb81e27b8913627d323fbad080ef85e800894" Mar 20 11:04:04 crc kubenswrapper[4695]: I0320 11:04:04.889226 4695 scope.go:117] "RemoveContainer" containerID="c8ba2e02d27a4cbab7995f8c08ba6c5274fec83b92720fa385863b167108d67e" Mar 20 11:04:05 crc kubenswrapper[4695]: I0320 11:04:05.145181 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566738-dkbr4"] Mar 20 11:04:05 crc kubenswrapper[4695]: I0320 11:04:05.148987 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566738-dkbr4"] Mar 20 11:04:06 crc kubenswrapper[4695]: I0320 11:04:06.895542 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3280710-4b29-43c1-8f83-4c18670a9e0a" path="/var/lib/kubelet/pods/e3280710-4b29-43c1-8f83-4c18670a9e0a/volumes" Mar 20 11:04:08 crc kubenswrapper[4695]: I0320 11:04:08.431417 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:04:08 crc kubenswrapper[4695]: I0320 11:04:08.431502 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 
20 11:04:38 crc kubenswrapper[4695]: I0320 11:04:38.430840 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:04:38 crc kubenswrapper[4695]: I0320 11:04:38.431716 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:04:38 crc kubenswrapper[4695]: I0320 11:04:38.431799 4695 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" Mar 20 11:04:38 crc kubenswrapper[4695]: I0320 11:04:38.432765 4695 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"69869f4a2adf9f245570146a3380a9dd7993c55d9e0096095daece76387fad83"} pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:04:38 crc kubenswrapper[4695]: I0320 11:04:38.432859 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" containerID="cri-o://69869f4a2adf9f245570146a3380a9dd7993c55d9e0096095daece76387fad83" gracePeriod=600 Mar 20 11:04:39 crc kubenswrapper[4695]: I0320 11:04:39.080035 4695 generic.go:334] "Generic (PLEG): container finished" podID="7859c924-84d7-4855-901e-c77a02c56e3a" 
containerID="69869f4a2adf9f245570146a3380a9dd7993c55d9e0096095daece76387fad83" exitCode=0 Mar 20 11:04:39 crc kubenswrapper[4695]: I0320 11:04:39.080124 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" event={"ID":"7859c924-84d7-4855-901e-c77a02c56e3a","Type":"ContainerDied","Data":"69869f4a2adf9f245570146a3380a9dd7993c55d9e0096095daece76387fad83"} Mar 20 11:04:39 crc kubenswrapper[4695]: I0320 11:04:39.080779 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" event={"ID":"7859c924-84d7-4855-901e-c77a02c56e3a","Type":"ContainerStarted","Data":"2df2ef181c2d99312310276228ee6486ca56cb58973b41cd3d1cfa930619b521"} Mar 20 11:04:39 crc kubenswrapper[4695]: I0320 11:04:39.080810 4695 scope.go:117] "RemoveContainer" containerID="ec82d0faf7e4964b116a15802965ff707a29f15669e56ffaba0e38d32bd99a78" Mar 20 11:05:04 crc kubenswrapper[4695]: I0320 11:05:04.950296 4695 scope.go:117] "RemoveContainer" containerID="46b2e745e6b9b201ca52ce5404c6ab2af4f5a866aed62740ab55e0a9e5f394ed" Mar 20 11:05:04 crc kubenswrapper[4695]: I0320 11:05:04.990779 4695 scope.go:117] "RemoveContainer" containerID="da190adf44ad8ab38b9bfd7c46ac1ac8c52e4a985325a6af23379c5273f634d5" Mar 20 11:06:00 crc kubenswrapper[4695]: I0320 11:06:00.142606 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566746-x62p7"] Mar 20 11:06:00 crc kubenswrapper[4695]: E0320 11:06:00.143766 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e0d246a-dbac-4f18-8f90-2f8c7a4fcc96" containerName="oc" Mar 20 11:06:00 crc kubenswrapper[4695]: I0320 11:06:00.143783 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e0d246a-dbac-4f18-8f90-2f8c7a4fcc96" containerName="oc" Mar 20 11:06:00 crc kubenswrapper[4695]: I0320 11:06:00.143928 4695 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2e0d246a-dbac-4f18-8f90-2f8c7a4fcc96" containerName="oc" Mar 20 11:06:00 crc kubenswrapper[4695]: I0320 11:06:00.144495 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566746-x62p7" Mar 20 11:06:00 crc kubenswrapper[4695]: I0320 11:06:00.146679 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5kqds" Mar 20 11:06:00 crc kubenswrapper[4695]: I0320 11:06:00.147310 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:06:00 crc kubenswrapper[4695]: I0320 11:06:00.147344 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:06:00 crc kubenswrapper[4695]: I0320 11:06:00.147361 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566746-x62p7"] Mar 20 11:06:00 crc kubenswrapper[4695]: I0320 11:06:00.271710 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-787t9\" (UniqueName: \"kubernetes.io/projected/1891389e-4ca3-4ba6-9eff-b2ee13a43180-kube-api-access-787t9\") pod \"auto-csr-approver-29566746-x62p7\" (UID: \"1891389e-4ca3-4ba6-9eff-b2ee13a43180\") " pod="openshift-infra/auto-csr-approver-29566746-x62p7" Mar 20 11:06:00 crc kubenswrapper[4695]: I0320 11:06:00.372672 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-787t9\" (UniqueName: \"kubernetes.io/projected/1891389e-4ca3-4ba6-9eff-b2ee13a43180-kube-api-access-787t9\") pod \"auto-csr-approver-29566746-x62p7\" (UID: \"1891389e-4ca3-4ba6-9eff-b2ee13a43180\") " pod="openshift-infra/auto-csr-approver-29566746-x62p7" Mar 20 11:06:00 crc kubenswrapper[4695]: I0320 11:06:00.395779 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-787t9\" (UniqueName: 
\"kubernetes.io/projected/1891389e-4ca3-4ba6-9eff-b2ee13a43180-kube-api-access-787t9\") pod \"auto-csr-approver-29566746-x62p7\" (UID: \"1891389e-4ca3-4ba6-9eff-b2ee13a43180\") " pod="openshift-infra/auto-csr-approver-29566746-x62p7" Mar 20 11:06:00 crc kubenswrapper[4695]: I0320 11:06:00.466667 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566746-x62p7" Mar 20 11:06:00 crc kubenswrapper[4695]: I0320 11:06:00.689626 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566746-x62p7"] Mar 20 11:06:01 crc kubenswrapper[4695]: I0320 11:06:01.558533 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566746-x62p7" event={"ID":"1891389e-4ca3-4ba6-9eff-b2ee13a43180","Type":"ContainerStarted","Data":"242c44a767c6626d9cdc096d6221eb12bddeafad73eddfcb19820323a61dc752"} Mar 20 11:06:02 crc kubenswrapper[4695]: I0320 11:06:02.568181 4695 generic.go:334] "Generic (PLEG): container finished" podID="1891389e-4ca3-4ba6-9eff-b2ee13a43180" containerID="319e51e37f6eb537ce8bc45a7d54556e4719959f61059a83f46a20d3656f8dd9" exitCode=0 Mar 20 11:06:02 crc kubenswrapper[4695]: I0320 11:06:02.568250 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566746-x62p7" event={"ID":"1891389e-4ca3-4ba6-9eff-b2ee13a43180","Type":"ContainerDied","Data":"319e51e37f6eb537ce8bc45a7d54556e4719959f61059a83f46a20d3656f8dd9"} Mar 20 11:06:03 crc kubenswrapper[4695]: I0320 11:06:03.808771 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566746-x62p7" Mar 20 11:06:03 crc kubenswrapper[4695]: I0320 11:06:03.929163 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-787t9\" (UniqueName: \"kubernetes.io/projected/1891389e-4ca3-4ba6-9eff-b2ee13a43180-kube-api-access-787t9\") pod \"1891389e-4ca3-4ba6-9eff-b2ee13a43180\" (UID: \"1891389e-4ca3-4ba6-9eff-b2ee13a43180\") " Mar 20 11:06:03 crc kubenswrapper[4695]: I0320 11:06:03.936656 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1891389e-4ca3-4ba6-9eff-b2ee13a43180-kube-api-access-787t9" (OuterVolumeSpecName: "kube-api-access-787t9") pod "1891389e-4ca3-4ba6-9eff-b2ee13a43180" (UID: "1891389e-4ca3-4ba6-9eff-b2ee13a43180"). InnerVolumeSpecName "kube-api-access-787t9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:06:04 crc kubenswrapper[4695]: I0320 11:06:04.031088 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-787t9\" (UniqueName: \"kubernetes.io/projected/1891389e-4ca3-4ba6-9eff-b2ee13a43180-kube-api-access-787t9\") on node \"crc\" DevicePath \"\"" Mar 20 11:06:04 crc kubenswrapper[4695]: I0320 11:06:04.581652 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566746-x62p7" event={"ID":"1891389e-4ca3-4ba6-9eff-b2ee13a43180","Type":"ContainerDied","Data":"242c44a767c6626d9cdc096d6221eb12bddeafad73eddfcb19820323a61dc752"} Mar 20 11:06:04 crc kubenswrapper[4695]: I0320 11:06:04.582185 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="242c44a767c6626d9cdc096d6221eb12bddeafad73eddfcb19820323a61dc752" Mar 20 11:06:04 crc kubenswrapper[4695]: I0320 11:06:04.581727 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566746-x62p7" Mar 20 11:06:04 crc kubenswrapper[4695]: I0320 11:06:04.882815 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566740-wbqkd"] Mar 20 11:06:04 crc kubenswrapper[4695]: I0320 11:06:04.895823 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566740-wbqkd"] Mar 20 11:06:06 crc kubenswrapper[4695]: I0320 11:06:06.902524 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a27fd160-2f93-4820-b5cf-566db0042594" path="/var/lib/kubelet/pods/a27fd160-2f93-4820-b5cf-566db0042594/volumes" Mar 20 11:06:38 crc kubenswrapper[4695]: I0320 11:06:38.430752 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:06:38 crc kubenswrapper[4695]: I0320 11:06:38.431560 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:07:05 crc kubenswrapper[4695]: I0320 11:07:05.059013 4695 scope.go:117] "RemoveContainer" containerID="6945876e53842900f6e1cdedca0308f6f7034a64d87aab98548ef4d7d5c67e89" Mar 20 11:07:08 crc kubenswrapper[4695]: I0320 11:07:08.430339 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:07:08 crc kubenswrapper[4695]: 
I0320 11:07:08.430756 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:07:26 crc kubenswrapper[4695]: I0320 11:07:26.358932 4695 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 11:07:38 crc kubenswrapper[4695]: I0320 11:07:38.431642 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:07:38 crc kubenswrapper[4695]: I0320 11:07:38.432661 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:07:38 crc kubenswrapper[4695]: I0320 11:07:38.432739 4695 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" Mar 20 11:07:38 crc kubenswrapper[4695]: I0320 11:07:38.433574 4695 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2df2ef181c2d99312310276228ee6486ca56cb58973b41cd3d1cfa930619b521"} pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:07:38 crc kubenswrapper[4695]: I0320 11:07:38.433638 
4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" containerID="cri-o://2df2ef181c2d99312310276228ee6486ca56cb58973b41cd3d1cfa930619b521" gracePeriod=600 Mar 20 11:07:39 crc kubenswrapper[4695]: I0320 11:07:39.176173 4695 generic.go:334] "Generic (PLEG): container finished" podID="7859c924-84d7-4855-901e-c77a02c56e3a" containerID="2df2ef181c2d99312310276228ee6486ca56cb58973b41cd3d1cfa930619b521" exitCode=0 Mar 20 11:07:39 crc kubenswrapper[4695]: I0320 11:07:39.176264 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" event={"ID":"7859c924-84d7-4855-901e-c77a02c56e3a","Type":"ContainerDied","Data":"2df2ef181c2d99312310276228ee6486ca56cb58973b41cd3d1cfa930619b521"} Mar 20 11:07:39 crc kubenswrapper[4695]: I0320 11:07:39.176733 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" event={"ID":"7859c924-84d7-4855-901e-c77a02c56e3a","Type":"ContainerStarted","Data":"b7f145b88a381dab0a3af9969335c65e981dd0f3a0b106c999ecbed0c035eef5"} Mar 20 11:07:39 crc kubenswrapper[4695]: I0320 11:07:39.176768 4695 scope.go:117] "RemoveContainer" containerID="69869f4a2adf9f245570146a3380a9dd7993c55d9e0096095daece76387fad83" Mar 20 11:07:43 crc kubenswrapper[4695]: I0320 11:07:43.088553 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-q6j7h"] Mar 20 11:07:43 crc kubenswrapper[4695]: E0320 11:07:43.089812 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1891389e-4ca3-4ba6-9eff-b2ee13a43180" containerName="oc" Mar 20 11:07:43 crc kubenswrapper[4695]: I0320 11:07:43.089835 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="1891389e-4ca3-4ba6-9eff-b2ee13a43180" containerName="oc" Mar 20 11:07:43 
crc kubenswrapper[4695]: I0320 11:07:43.090234 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="1891389e-4ca3-4ba6-9eff-b2ee13a43180" containerName="oc" Mar 20 11:07:43 crc kubenswrapper[4695]: I0320 11:07:43.090883 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-q6j7h" Mar 20 11:07:43 crc kubenswrapper[4695]: I0320 11:07:43.093743 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 20 11:07:43 crc kubenswrapper[4695]: I0320 11:07:43.093969 4695 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-qzrjg" Mar 20 11:07:43 crc kubenswrapper[4695]: I0320 11:07:43.095240 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 20 11:07:43 crc kubenswrapper[4695]: I0320 11:07:43.097286 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-5w452"] Mar 20 11:07:43 crc kubenswrapper[4695]: I0320 11:07:43.098110 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-5w452" Mar 20 11:07:43 crc kubenswrapper[4695]: I0320 11:07:43.105791 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-q6j7h"] Mar 20 11:07:43 crc kubenswrapper[4695]: I0320 11:07:43.107845 4695 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-2kbzm" Mar 20 11:07:43 crc kubenswrapper[4695]: I0320 11:07:43.112840 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-tgcp4"] Mar 20 11:07:43 crc kubenswrapper[4695]: I0320 11:07:43.114069 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-tgcp4" Mar 20 11:07:43 crc kubenswrapper[4695]: I0320 11:07:43.123005 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-5w452"] Mar 20 11:07:43 crc kubenswrapper[4695]: I0320 11:07:43.124832 4695 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-f6hkk" Mar 20 11:07:43 crc kubenswrapper[4695]: I0320 11:07:43.139304 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-tgcp4"] Mar 20 11:07:43 crc kubenswrapper[4695]: I0320 11:07:43.228383 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2xl6\" (UniqueName: \"kubernetes.io/projected/bbe529d1-f905-4175-b7e7-44aa52a9cfcf-kube-api-access-t2xl6\") pod \"cert-manager-858654f9db-5w452\" (UID: \"bbe529d1-f905-4175-b7e7-44aa52a9cfcf\") " pod="cert-manager/cert-manager-858654f9db-5w452" Mar 20 11:07:43 crc kubenswrapper[4695]: I0320 11:07:43.228487 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nxtz\" (UniqueName: \"kubernetes.io/projected/87d5459b-d306-4870-a3bd-d41bceeb6b50-kube-api-access-2nxtz\") pod \"cert-manager-cainjector-cf98fcc89-q6j7h\" (UID: \"87d5459b-d306-4870-a3bd-d41bceeb6b50\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-q6j7h" Mar 20 11:07:43 crc kubenswrapper[4695]: I0320 11:07:43.228516 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfd9b\" (UniqueName: \"kubernetes.io/projected/0756f4f4-4178-4716-b64e-33302d72d6de-kube-api-access-kfd9b\") pod \"cert-manager-webhook-687f57d79b-tgcp4\" (UID: \"0756f4f4-4178-4716-b64e-33302d72d6de\") " pod="cert-manager/cert-manager-webhook-687f57d79b-tgcp4" Mar 20 11:07:43 crc kubenswrapper[4695]: I0320 11:07:43.330175 4695 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2xl6\" (UniqueName: \"kubernetes.io/projected/bbe529d1-f905-4175-b7e7-44aa52a9cfcf-kube-api-access-t2xl6\") pod \"cert-manager-858654f9db-5w452\" (UID: \"bbe529d1-f905-4175-b7e7-44aa52a9cfcf\") " pod="cert-manager/cert-manager-858654f9db-5w452" Mar 20 11:07:43 crc kubenswrapper[4695]: I0320 11:07:43.330322 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nxtz\" (UniqueName: \"kubernetes.io/projected/87d5459b-d306-4870-a3bd-d41bceeb6b50-kube-api-access-2nxtz\") pod \"cert-manager-cainjector-cf98fcc89-q6j7h\" (UID: \"87d5459b-d306-4870-a3bd-d41bceeb6b50\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-q6j7h" Mar 20 11:07:43 crc kubenswrapper[4695]: I0320 11:07:43.330356 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfd9b\" (UniqueName: \"kubernetes.io/projected/0756f4f4-4178-4716-b64e-33302d72d6de-kube-api-access-kfd9b\") pod \"cert-manager-webhook-687f57d79b-tgcp4\" (UID: \"0756f4f4-4178-4716-b64e-33302d72d6de\") " pod="cert-manager/cert-manager-webhook-687f57d79b-tgcp4" Mar 20 11:07:43 crc kubenswrapper[4695]: I0320 11:07:43.354271 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfd9b\" (UniqueName: \"kubernetes.io/projected/0756f4f4-4178-4716-b64e-33302d72d6de-kube-api-access-kfd9b\") pod \"cert-manager-webhook-687f57d79b-tgcp4\" (UID: \"0756f4f4-4178-4716-b64e-33302d72d6de\") " pod="cert-manager/cert-manager-webhook-687f57d79b-tgcp4" Mar 20 11:07:43 crc kubenswrapper[4695]: I0320 11:07:43.354332 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nxtz\" (UniqueName: \"kubernetes.io/projected/87d5459b-d306-4870-a3bd-d41bceeb6b50-kube-api-access-2nxtz\") pod \"cert-manager-cainjector-cf98fcc89-q6j7h\" (UID: \"87d5459b-d306-4870-a3bd-d41bceeb6b50\") " 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-q6j7h" Mar 20 11:07:43 crc kubenswrapper[4695]: I0320 11:07:43.354427 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2xl6\" (UniqueName: \"kubernetes.io/projected/bbe529d1-f905-4175-b7e7-44aa52a9cfcf-kube-api-access-t2xl6\") pod \"cert-manager-858654f9db-5w452\" (UID: \"bbe529d1-f905-4175-b7e7-44aa52a9cfcf\") " pod="cert-manager/cert-manager-858654f9db-5w452" Mar 20 11:07:43 crc kubenswrapper[4695]: I0320 11:07:43.410763 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-q6j7h" Mar 20 11:07:43 crc kubenswrapper[4695]: I0320 11:07:43.435835 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-5w452" Mar 20 11:07:43 crc kubenswrapper[4695]: I0320 11:07:43.443897 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-tgcp4" Mar 20 11:07:43 crc kubenswrapper[4695]: I0320 11:07:43.648838 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-q6j7h"] Mar 20 11:07:43 crc kubenswrapper[4695]: I0320 11:07:43.932067 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-tgcp4"] Mar 20 11:07:43 crc kubenswrapper[4695]: I0320 11:07:43.934520 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-5w452"] Mar 20 11:07:44 crc kubenswrapper[4695]: I0320 11:07:44.222646 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-q6j7h" event={"ID":"87d5459b-d306-4870-a3bd-d41bceeb6b50","Type":"ContainerStarted","Data":"4b6773e18b25ffd74c5d4eac2ffed1c55c6153a47ec08c201f2cacf2ac378341"} Mar 20 11:07:44 crc kubenswrapper[4695]: I0320 11:07:44.225441 4695 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="cert-manager/cert-manager-858654f9db-5w452" event={"ID":"bbe529d1-f905-4175-b7e7-44aa52a9cfcf","Type":"ContainerStarted","Data":"f78c47e06de18206b33b99e07075951dcc756cad99123b45123d3f4121717966"} Mar 20 11:07:44 crc kubenswrapper[4695]: I0320 11:07:44.227149 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-tgcp4" event={"ID":"0756f4f4-4178-4716-b64e-33302d72d6de","Type":"ContainerStarted","Data":"07bafbaba8388e7c214454da85a32452cd348efdf24b149bbd8fcb9c55c87f28"} Mar 20 11:07:48 crc kubenswrapper[4695]: I0320 11:07:48.263787 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-5w452" event={"ID":"bbe529d1-f905-4175-b7e7-44aa52a9cfcf","Type":"ContainerStarted","Data":"d9a6623973b0abec77ced24a9dd0dd85a35eafbab60708e527d66eb75c458a69"} Mar 20 11:07:48 crc kubenswrapper[4695]: I0320 11:07:48.266476 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-q6j7h" event={"ID":"87d5459b-d306-4870-a3bd-d41bceeb6b50","Type":"ContainerStarted","Data":"0ffb3c293fda95828475c31d1c4aed3bcd22dd0d91e5f021a26d973b28b682c3"} Mar 20 11:07:48 crc kubenswrapper[4695]: I0320 11:07:48.269487 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-tgcp4" event={"ID":"0756f4f4-4178-4716-b64e-33302d72d6de","Type":"ContainerStarted","Data":"dce1a0e3ab0940eef5d9a56937d47948f71f36f505559886abe96eab956eb963"} Mar 20 11:07:48 crc kubenswrapper[4695]: I0320 11:07:48.269718 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-tgcp4" Mar 20 11:07:48 crc kubenswrapper[4695]: I0320 11:07:48.287659 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-5w452" podStartSLOduration=1.980829529 podStartE2EDuration="5.287626289s" podCreationTimestamp="2026-03-20 
11:07:43 +0000 UTC" firstStartedPulling="2026-03-20 11:07:43.93852951 +0000 UTC m=+841.719135073" lastFinishedPulling="2026-03-20 11:07:47.24532627 +0000 UTC m=+845.025931833" observedRunningTime="2026-03-20 11:07:48.285691909 +0000 UTC m=+846.066297482" watchObservedRunningTime="2026-03-20 11:07:48.287626289 +0000 UTC m=+846.068231852" Mar 20 11:07:48 crc kubenswrapper[4695]: I0320 11:07:48.305183 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-q6j7h" podStartSLOduration=1.723701414 podStartE2EDuration="5.30515905s" podCreationTimestamp="2026-03-20 11:07:43 +0000 UTC" firstStartedPulling="2026-03-20 11:07:43.662610352 +0000 UTC m=+841.443215915" lastFinishedPulling="2026-03-20 11:07:47.244067988 +0000 UTC m=+845.024673551" observedRunningTime="2026-03-20 11:07:48.303114348 +0000 UTC m=+846.083719911" watchObservedRunningTime="2026-03-20 11:07:48.30515905 +0000 UTC m=+846.085764603" Mar 20 11:07:48 crc kubenswrapper[4695]: I0320 11:07:48.326461 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-tgcp4" podStartSLOduration=1.9592247170000001 podStartE2EDuration="5.326414508s" podCreationTimestamp="2026-03-20 11:07:43 +0000 UTC" firstStartedPulling="2026-03-20 11:07:43.934813509 +0000 UTC m=+841.715419072" lastFinishedPulling="2026-03-20 11:07:47.3020033 +0000 UTC m=+845.082608863" observedRunningTime="2026-03-20 11:07:48.326143431 +0000 UTC m=+846.106749004" watchObservedRunningTime="2026-03-20 11:07:48.326414508 +0000 UTC m=+846.107020071" Mar 20 11:07:51 crc kubenswrapper[4695]: I0320 11:07:51.763979 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nx4bc"] Mar 20 11:07:51 crc kubenswrapper[4695]: I0320 11:07:51.764844 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" 
podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="ovn-controller" containerID="cri-o://e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb" gracePeriod=30 Mar 20 11:07:51 crc kubenswrapper[4695]: I0320 11:07:51.764942 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380" gracePeriod=30 Mar 20 11:07:51 crc kubenswrapper[4695]: I0320 11:07:51.764996 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="sbdb" containerID="cri-o://d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3" gracePeriod=30 Mar 20 11:07:51 crc kubenswrapper[4695]: I0320 11:07:51.765036 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="kube-rbac-proxy-node" containerID="cri-o://629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866" gracePeriod=30 Mar 20 11:07:51 crc kubenswrapper[4695]: I0320 11:07:51.765089 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="ovn-acl-logging" containerID="cri-o://a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38" gracePeriod=30 Mar 20 11:07:51 crc kubenswrapper[4695]: I0320 11:07:51.764942 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="nbdb" containerID="cri-o://8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84" 
gracePeriod=30 Mar 20 11:07:51 crc kubenswrapper[4695]: I0320 11:07:51.765643 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="northd" containerID="cri-o://f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596" gracePeriod=30 Mar 20 11:07:51 crc kubenswrapper[4695]: I0320 11:07:51.805120 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="ovnkube-controller" containerID="cri-o://69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094" gracePeriod=30 Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.136474 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx4bc_7010d107-c3b1-4cc2-83c2-523df13ecd43/ovnkube-controller/3.log" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.138428 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx4bc_7010d107-c3b1-4cc2-83c2-523df13ecd43/ovn-acl-logging/0.log" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.138955 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx4bc_7010d107-c3b1-4cc2-83c2-523df13ecd43/ovn-controller/0.log" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.139760 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.198755 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bq2lk"] Mar 20 11:07:52 crc kubenswrapper[4695]: E0320 11:07:52.199095 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="kube-rbac-proxy-node" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.199111 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="kube-rbac-proxy-node" Mar 20 11:07:52 crc kubenswrapper[4695]: E0320 11:07:52.199120 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="ovnkube-controller" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.199129 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="ovnkube-controller" Mar 20 11:07:52 crc kubenswrapper[4695]: E0320 11:07:52.199145 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="sbdb" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.199154 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="sbdb" Mar 20 11:07:52 crc kubenswrapper[4695]: E0320 11:07:52.199166 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="kubecfg-setup" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.199173 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="kubecfg-setup" Mar 20 11:07:52 crc kubenswrapper[4695]: E0320 11:07:52.199188 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" 
containerName="ovn-controller" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.199196 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="ovn-controller" Mar 20 11:07:52 crc kubenswrapper[4695]: E0320 11:07:52.199207 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="ovn-acl-logging" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.199213 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="ovn-acl-logging" Mar 20 11:07:52 crc kubenswrapper[4695]: E0320 11:07:52.199226 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="ovnkube-controller" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.199233 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="ovnkube-controller" Mar 20 11:07:52 crc kubenswrapper[4695]: E0320 11:07:52.199243 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="northd" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.199249 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="northd" Mar 20 11:07:52 crc kubenswrapper[4695]: E0320 11:07:52.199256 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="nbdb" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.199262 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="nbdb" Mar 20 11:07:52 crc kubenswrapper[4695]: E0320 11:07:52.199273 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 
11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.199280 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 11:07:52 crc kubenswrapper[4695]: E0320 11:07:52.199292 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="ovnkube-controller" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.199298 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="ovnkube-controller" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.199424 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="ovnkube-controller" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.199437 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="nbdb" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.199450 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="ovn-acl-logging" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.199462 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="ovnkube-controller" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.199471 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="ovnkube-controller" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.199481 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="sbdb" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.199491 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" 
containerName="ovn-controller" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.199502 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.199515 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="northd" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.199526 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="kube-rbac-proxy-node" Mar 20 11:07:52 crc kubenswrapper[4695]: E0320 11:07:52.199642 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="ovnkube-controller" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.199652 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="ovnkube-controller" Mar 20 11:07:52 crc kubenswrapper[4695]: E0320 11:07:52.199666 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="ovnkube-controller" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.199674 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="ovnkube-controller" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.199791 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="ovnkube-controller" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.199802 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerName="ovnkube-controller" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.207501 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.275467 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-cni-bin\") pod \"7010d107-c3b1-4cc2-83c2-523df13ecd43\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.275562 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7010d107-c3b1-4cc2-83c2-523df13ecd43-ovnkube-script-lib\") pod \"7010d107-c3b1-4cc2-83c2-523df13ecd43\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.275609 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7010d107-c3b1-4cc2-83c2-523df13ecd43-ovnkube-config\") pod \"7010d107-c3b1-4cc2-83c2-523df13ecd43\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.275640 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-var-lib-openvswitch\") pod \"7010d107-c3b1-4cc2-83c2-523df13ecd43\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.275673 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-slash\") pod \"7010d107-c3b1-4cc2-83c2-523df13ecd43\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.275704 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-run-ovn\") pod \"7010d107-c3b1-4cc2-83c2-523df13ecd43\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.275679 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "7010d107-c3b1-4cc2-83c2-523df13ecd43" (UID: "7010d107-c3b1-4cc2-83c2-523df13ecd43"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.275732 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-kubelet\") pod \"7010d107-c3b1-4cc2-83c2-523df13ecd43\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.275759 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-node-log\") pod \"7010d107-c3b1-4cc2-83c2-523df13ecd43\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.275781 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-slash" (OuterVolumeSpecName: "host-slash") pod "7010d107-c3b1-4cc2-83c2-523df13ecd43" (UID: "7010d107-c3b1-4cc2-83c2-523df13ecd43"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.275787 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-run-ovn-kubernetes\") pod \"7010d107-c3b1-4cc2-83c2-523df13ecd43\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.275812 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-run-netns\") pod \"7010d107-c3b1-4cc2-83c2-523df13ecd43\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.275839 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-etc-openvswitch\") pod \"7010d107-c3b1-4cc2-83c2-523df13ecd43\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.275866 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdz7d\" (UniqueName: \"kubernetes.io/projected/7010d107-c3b1-4cc2-83c2-523df13ecd43-kube-api-access-qdz7d\") pod \"7010d107-c3b1-4cc2-83c2-523df13ecd43\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.275886 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-run-openvswitch\") pod \"7010d107-c3b1-4cc2-83c2-523df13ecd43\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.275900 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-systemd-units\") pod \"7010d107-c3b1-4cc2-83c2-523df13ecd43\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.275953 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7010d107-c3b1-4cc2-83c2-523df13ecd43-ovn-node-metrics-cert\") pod \"7010d107-c3b1-4cc2-83c2-523df13ecd43\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.275979 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-log-socket\") pod \"7010d107-c3b1-4cc2-83c2-523df13ecd43\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.276007 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-var-lib-cni-networks-ovn-kubernetes\") pod \"7010d107-c3b1-4cc2-83c2-523df13ecd43\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.276051 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-run-systemd\") pod \"7010d107-c3b1-4cc2-83c2-523df13ecd43\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.276085 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-cni-netd\") pod \"7010d107-c3b1-4cc2-83c2-523df13ecd43\" (UID: 
\"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.276169 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7010d107-c3b1-4cc2-83c2-523df13ecd43-env-overrides\") pod \"7010d107-c3b1-4cc2-83c2-523df13ecd43\" (UID: \"7010d107-c3b1-4cc2-83c2-523df13ecd43\") " Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.276015 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "7010d107-c3b1-4cc2-83c2-523df13ecd43" (UID: "7010d107-c3b1-4cc2-83c2-523df13ecd43"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.276080 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "7010d107-c3b1-4cc2-83c2-523df13ecd43" (UID: "7010d107-c3b1-4cc2-83c2-523df13ecd43"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.276106 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "7010d107-c3b1-4cc2-83c2-523df13ecd43" (UID: "7010d107-c3b1-4cc2-83c2-523df13ecd43"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.276124 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-node-log" (OuterVolumeSpecName: "node-log") pod "7010d107-c3b1-4cc2-83c2-523df13ecd43" (UID: "7010d107-c3b1-4cc2-83c2-523df13ecd43"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.276141 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "7010d107-c3b1-4cc2-83c2-523df13ecd43" (UID: "7010d107-c3b1-4cc2-83c2-523df13ecd43"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.276131 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "7010d107-c3b1-4cc2-83c2-523df13ecd43" (UID: "7010d107-c3b1-4cc2-83c2-523df13ecd43"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.276230 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "7010d107-c3b1-4cc2-83c2-523df13ecd43" (UID: "7010d107-c3b1-4cc2-83c2-523df13ecd43"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.276355 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-log-socket" (OuterVolumeSpecName: "log-socket") pod "7010d107-c3b1-4cc2-83c2-523df13ecd43" (UID: "7010d107-c3b1-4cc2-83c2-523df13ecd43"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.276370 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/899952c4-d6c4-4587-8031-51da28cb5645-ovnkube-config\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.276351 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "7010d107-c3b1-4cc2-83c2-523df13ecd43" (UID: "7010d107-c3b1-4cc2-83c2-523df13ecd43"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.276390 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "7010d107-c3b1-4cc2-83c2-523df13ecd43" (UID: "7010d107-c3b1-4cc2-83c2-523df13ecd43"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.276411 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-host-run-netns\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.276522 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "7010d107-c3b1-4cc2-83c2-523df13ecd43" (UID: "7010d107-c3b1-4cc2-83c2-523df13ecd43"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.276576 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7010d107-c3b1-4cc2-83c2-523df13ecd43-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "7010d107-c3b1-4cc2-83c2-523df13ecd43" (UID: "7010d107-c3b1-4cc2-83c2-523df13ecd43"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.276567 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/899952c4-d6c4-4587-8031-51da28cb5645-ovnkube-script-lib\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.276746 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "7010d107-c3b1-4cc2-83c2-523df13ecd43" (UID: "7010d107-c3b1-4cc2-83c2-523df13ecd43"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.276788 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7010d107-c3b1-4cc2-83c2-523df13ecd43-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "7010d107-c3b1-4cc2-83c2-523df13ecd43" (UID: "7010d107-c3b1-4cc2-83c2-523df13ecd43"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.276897 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7010d107-c3b1-4cc2-83c2-523df13ecd43-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "7010d107-c3b1-4cc2-83c2-523df13ecd43" (UID: "7010d107-c3b1-4cc2-83c2-523df13ecd43"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.277135 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-run-systemd\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.277216 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/899952c4-d6c4-4587-8031-51da28cb5645-ovn-node-metrics-cert\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.277256 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-run-ovn\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.277326 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.277433 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-node-log\") pod 
\"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.277497 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/899952c4-d6c4-4587-8031-51da28cb5645-env-overrides\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.277536 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-run-openvswitch\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.277618 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-host-cni-bin\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.277649 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-host-cni-netd\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.277677 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-var-lib-openvswitch\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.277700 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-etc-openvswitch\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.277718 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-log-socket\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.277775 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-host-kubelet\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.277822 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx54z\" (UniqueName: \"kubernetes.io/projected/899952c4-d6c4-4587-8031-51da28cb5645-kube-api-access-mx54z\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.277874 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-host-slash\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.278002 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-host-run-ovn-kubernetes\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.278054 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-systemd-units\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.278174 4695 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-log-socket\") on node \"crc\" DevicePath \"\"" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.278201 4695 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.278218 4695 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.278231 4695 reconciler_common.go:293] 
"Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/7010d107-c3b1-4cc2-83c2-523df13ecd43-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.278244 4695 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.278256 4695 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/7010d107-c3b1-4cc2-83c2-523df13ecd43-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.278270 4695 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/7010d107-c3b1-4cc2-83c2-523df13ecd43-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.278283 4695 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.278296 4695 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-slash\") on node \"crc\" DevicePath \"\"" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.278307 4695 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.278321 4695 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.278334 4695 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-node-log\") on node \"crc\" DevicePath \"\"" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.278346 4695 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.278358 4695 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.278391 4695 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.278404 4695 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.278420 4695 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.283285 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7010d107-c3b1-4cc2-83c2-523df13ecd43-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") 
pod "7010d107-c3b1-4cc2-83c2-523df13ecd43" (UID: "7010d107-c3b1-4cc2-83c2-523df13ecd43"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.283945 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7010d107-c3b1-4cc2-83c2-523df13ecd43-kube-api-access-qdz7d" (OuterVolumeSpecName: "kube-api-access-qdz7d") pod "7010d107-c3b1-4cc2-83c2-523df13ecd43" (UID: "7010d107-c3b1-4cc2-83c2-523df13ecd43"). InnerVolumeSpecName "kube-api-access-qdz7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.292761 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "7010d107-c3b1-4cc2-83c2-523df13ecd43" (UID: "7010d107-c3b1-4cc2-83c2-523df13ecd43"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.296654 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx4bc_7010d107-c3b1-4cc2-83c2-523df13ecd43/ovnkube-controller/3.log" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.300172 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx4bc_7010d107-c3b1-4cc2-83c2-523df13ecd43/ovn-acl-logging/0.log" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.300867 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-nx4bc_7010d107-c3b1-4cc2-83c2-523df13ecd43/ovn-controller/0.log" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.301355 4695 generic.go:334] "Generic (PLEG): container finished" podID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerID="69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094" exitCode=0 Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.301393 4695 generic.go:334] "Generic (PLEG): container finished" podID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerID="d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3" exitCode=0 Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.301403 4695 generic.go:334] "Generic (PLEG): container finished" podID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerID="8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84" exitCode=0 Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.301414 4695 generic.go:334] "Generic (PLEG): container finished" podID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerID="f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596" exitCode=0 Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.301425 4695 generic.go:334] "Generic (PLEG): container finished" podID="7010d107-c3b1-4cc2-83c2-523df13ecd43" 
containerID="b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380" exitCode=0 Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.301435 4695 generic.go:334] "Generic (PLEG): container finished" podID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerID="629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866" exitCode=0 Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.301445 4695 generic.go:334] "Generic (PLEG): container finished" podID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerID="a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38" exitCode=143 Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.301455 4695 generic.go:334] "Generic (PLEG): container finished" podID="7010d107-c3b1-4cc2-83c2-523df13ecd43" containerID="e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb" exitCode=143 Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.301440 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" event={"ID":"7010d107-c3b1-4cc2-83c2-523df13ecd43","Type":"ContainerDied","Data":"69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.301514 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" event={"ID":"7010d107-c3b1-4cc2-83c2-523df13ecd43","Type":"ContainerDied","Data":"d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.301533 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" event={"ID":"7010d107-c3b1-4cc2-83c2-523df13ecd43","Type":"ContainerDied","Data":"8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.301548 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" 
event={"ID":"7010d107-c3b1-4cc2-83c2-523df13ecd43","Type":"ContainerDied","Data":"f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.301587 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.301565 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" event={"ID":"7010d107-c3b1-4cc2-83c2-523df13ecd43","Type":"ContainerDied","Data":"b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.301734 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" event={"ID":"7010d107-c3b1-4cc2-83c2-523df13ecd43","Type":"ContainerDied","Data":"629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.301754 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.301769 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.301776 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.301618 4695 scope.go:117] "RemoveContainer" containerID="69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.301784 4695 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.301892 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.301922 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.301936 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.301943 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.301951 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.301963 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" event={"ID":"7010d107-c3b1-4cc2-83c2-523df13ecd43","Type":"ContainerDied","Data":"a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.301982 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094"} Mar 20 
11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.301993 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.302000 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.302007 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.302016 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.302028 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.302036 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.302043 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.302050 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb"} Mar 20 
11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.302058 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.302069 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" event={"ID":"7010d107-c3b1-4cc2-83c2-523df13ecd43","Type":"ContainerDied","Data":"e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.302083 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.302093 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.302102 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.302109 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.302117 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.302128 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.302136 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.302143 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.302151 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.302158 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.302169 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-nx4bc" event={"ID":"7010d107-c3b1-4cc2-83c2-523df13ecd43","Type":"ContainerDied","Data":"f0a97ee302090f2da0e2af64936cace6f5bfeff8d85c49d32e937fee4400cc7d"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.302181 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.302191 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.302199 4695 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.302207 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.302214 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.302222 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.302228 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.302237 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.302245 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.302253 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.303887 4695 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hg7g5_52301735-de4f-4672-9e4d-6bd74bccedad/kube-multus/2.log" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.304485 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hg7g5_52301735-de4f-4672-9e4d-6bd74bccedad/kube-multus/1.log" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.304519 4695 generic.go:334] "Generic (PLEG): container finished" podID="52301735-de4f-4672-9e4d-6bd74bccedad" containerID="6288afe02e5624f347e826d8f85bfb546d5a45b435e01c0a0b3bd13018172586" exitCode=2 Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.304541 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hg7g5" event={"ID":"52301735-de4f-4672-9e4d-6bd74bccedad","Type":"ContainerDied","Data":"6288afe02e5624f347e826d8f85bfb546d5a45b435e01c0a0b3bd13018172586"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.304557 4695 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"05945c73a4c93e5fd6ce1bc322d9acf71d5c1005cd47a22875bfe3b8a3eb8806"} Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.305589 4695 scope.go:117] "RemoveContainer" containerID="6288afe02e5624f347e826d8f85bfb546d5a45b435e01c0a0b3bd13018172586" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.329271 4695 scope.go:117] "RemoveContainer" containerID="dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.351632 4695 scope.go:117] "RemoveContainer" containerID="d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.363396 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-nx4bc"] Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.370522 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-node-nx4bc"] Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.375334 4695 scope.go:117] "RemoveContainer" containerID="8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.379711 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/899952c4-d6c4-4587-8031-51da28cb5645-ovnkube-script-lib\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.379751 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-run-systemd\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.379788 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/899952c4-d6c4-4587-8031-51da28cb5645-ovn-node-metrics-cert\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.379807 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-run-ovn\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.379825 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.379851 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-node-log\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.379875 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/899952c4-d6c4-4587-8031-51da28cb5645-env-overrides\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.379894 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-run-openvswitch\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.379972 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-host-cni-bin\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.379994 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-host-cni-netd\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.380013 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-var-lib-openvswitch\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.380043 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-etc-openvswitch\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.380059 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-log-socket\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.380083 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-host-kubelet\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.380100 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx54z\" (UniqueName: 
\"kubernetes.io/projected/899952c4-d6c4-4587-8031-51da28cb5645-kube-api-access-mx54z\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.380122 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-host-slash\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.380148 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-host-run-ovn-kubernetes\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.380169 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-systemd-units\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.380200 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/899952c4-d6c4-4587-8031-51da28cb5645-ovnkube-config\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.380222 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-host-run-netns\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.380267 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdz7d\" (UniqueName: \"kubernetes.io/projected/7010d107-c3b1-4cc2-83c2-523df13ecd43-kube-api-access-qdz7d\") on node \"crc\" DevicePath \"\"" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.380278 4695 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/7010d107-c3b1-4cc2-83c2-523df13ecd43-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.380290 4695 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/7010d107-c3b1-4cc2-83c2-523df13ecd43-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.380295 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-host-cni-netd\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.380342 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-host-run-netns\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.380380 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-var-lib-openvswitch\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.380982 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-etc-openvswitch\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.381132 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-log-socket\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.381157 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-host-kubelet\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.381193 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/899952c4-d6c4-4587-8031-51da28cb5645-ovnkube-script-lib\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.381253 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-run-systemd\") pod \"ovnkube-node-bq2lk\" (UID: 
\"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.381531 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-host-slash\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.381605 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-host-run-ovn-kubernetes\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.381646 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-systemd-units\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.382800 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/899952c4-d6c4-4587-8031-51da28cb5645-ovnkube-config\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.382875 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-node-log\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc 
kubenswrapper[4695]: I0320 11:07:52.382962 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-run-ovn\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.382997 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.383049 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-run-openvswitch\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.383635 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/899952c4-d6c4-4587-8031-51da28cb5645-env-overrides\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.383875 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/899952c4-d6c4-4587-8031-51da28cb5645-host-cni-bin\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.386123 4695 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/899952c4-d6c4-4587-8031-51da28cb5645-ovn-node-metrics-cert\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.393017 4695 scope.go:117] "RemoveContainer" containerID="f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.400534 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx54z\" (UniqueName: \"kubernetes.io/projected/899952c4-d6c4-4587-8031-51da28cb5645-kube-api-access-mx54z\") pod \"ovnkube-node-bq2lk\" (UID: \"899952c4-d6c4-4587-8031-51da28cb5645\") " pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.409558 4695 scope.go:117] "RemoveContainer" containerID="b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.427096 4695 scope.go:117] "RemoveContainer" containerID="629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.446946 4695 scope.go:117] "RemoveContainer" containerID="a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.464218 4695 scope.go:117] "RemoveContainer" containerID="e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.480581 4695 scope.go:117] "RemoveContainer" containerID="253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.495505 4695 scope.go:117] "RemoveContainer" containerID="69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094" Mar 20 11:07:52 crc kubenswrapper[4695]: E0320 11:07:52.495877 
4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094\": container with ID starting with 69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094 not found: ID does not exist" containerID="69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.495940 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094"} err="failed to get container status \"69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094\": rpc error: code = NotFound desc = could not find container \"69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094\": container with ID starting with 69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.495970 4695 scope.go:117] "RemoveContainer" containerID="dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044" Mar 20 11:07:52 crc kubenswrapper[4695]: E0320 11:07:52.496242 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044\": container with ID starting with dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044 not found: ID does not exist" containerID="dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.496269 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044"} err="failed to get container status \"dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044\": rpc error: code = 
NotFound desc = could not find container \"dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044\": container with ID starting with dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.496285 4695 scope.go:117] "RemoveContainer" containerID="d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3" Mar 20 11:07:52 crc kubenswrapper[4695]: E0320 11:07:52.496543 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\": container with ID starting with d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3 not found: ID does not exist" containerID="d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.496575 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3"} err="failed to get container status \"d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\": rpc error: code = NotFound desc = could not find container \"d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\": container with ID starting with d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.496596 4695 scope.go:117] "RemoveContainer" containerID="8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84" Mar 20 11:07:52 crc kubenswrapper[4695]: E0320 11:07:52.496845 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\": container with ID starting with 
8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84 not found: ID does not exist" containerID="8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.496872 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84"} err="failed to get container status \"8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\": rpc error: code = NotFound desc = could not find container \"8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\": container with ID starting with 8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.496891 4695 scope.go:117] "RemoveContainer" containerID="f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596" Mar 20 11:07:52 crc kubenswrapper[4695]: E0320 11:07:52.497147 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\": container with ID starting with f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596 not found: ID does not exist" containerID="f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.497168 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596"} err="failed to get container status \"f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\": rpc error: code = NotFound desc = could not find container \"f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\": container with ID starting with f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596 not found: ID does not 
exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.497182 4695 scope.go:117] "RemoveContainer" containerID="b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380" Mar 20 11:07:52 crc kubenswrapper[4695]: E0320 11:07:52.497573 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\": container with ID starting with b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380 not found: ID does not exist" containerID="b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.497638 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380"} err="failed to get container status \"b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\": rpc error: code = NotFound desc = could not find container \"b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\": container with ID starting with b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.497676 4695 scope.go:117] "RemoveContainer" containerID="629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866" Mar 20 11:07:52 crc kubenswrapper[4695]: E0320 11:07:52.498093 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\": container with ID starting with 629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866 not found: ID does not exist" containerID="629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.498125 4695 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866"} err="failed to get container status \"629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\": rpc error: code = NotFound desc = could not find container \"629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\": container with ID starting with 629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.498140 4695 scope.go:117] "RemoveContainer" containerID="a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38" Mar 20 11:07:52 crc kubenswrapper[4695]: E0320 11:07:52.498673 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\": container with ID starting with a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38 not found: ID does not exist" containerID="a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.498700 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38"} err="failed to get container status \"a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\": rpc error: code = NotFound desc = could not find container \"a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\": container with ID starting with a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.498714 4695 scope.go:117] "RemoveContainer" containerID="e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb" Mar 20 11:07:52 crc kubenswrapper[4695]: E0320 11:07:52.499154 4695 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\": container with ID starting with e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb not found: ID does not exist" containerID="e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.499184 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb"} err="failed to get container status \"e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\": rpc error: code = NotFound desc = could not find container \"e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\": container with ID starting with e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.499204 4695 scope.go:117] "RemoveContainer" containerID="253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f" Mar 20 11:07:52 crc kubenswrapper[4695]: E0320 11:07:52.499444 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\": container with ID starting with 253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f not found: ID does not exist" containerID="253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.499477 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f"} err="failed to get container status \"253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\": rpc error: code = NotFound desc = could 
not find container \"253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\": container with ID starting with 253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.499492 4695 scope.go:117] "RemoveContainer" containerID="69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.499818 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094"} err="failed to get container status \"69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094\": rpc error: code = NotFound desc = could not find container \"69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094\": container with ID starting with 69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.499840 4695 scope.go:117] "RemoveContainer" containerID="dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.500237 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044"} err="failed to get container status \"dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044\": rpc error: code = NotFound desc = could not find container \"dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044\": container with ID starting with dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.500258 4695 scope.go:117] "RemoveContainer" containerID="d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 
11:07:52.500564 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3"} err="failed to get container status \"d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\": rpc error: code = NotFound desc = could not find container \"d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\": container with ID starting with d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.500584 4695 scope.go:117] "RemoveContainer" containerID="8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.500833 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84"} err="failed to get container status \"8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\": rpc error: code = NotFound desc = could not find container \"8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\": container with ID starting with 8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.500849 4695 scope.go:117] "RemoveContainer" containerID="f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.501407 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596"} err="failed to get container status \"f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\": rpc error: code = NotFound desc = could not find container \"f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\": container with ID starting with 
f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.501440 4695 scope.go:117] "RemoveContainer" containerID="b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.501704 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380"} err="failed to get container status \"b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\": rpc error: code = NotFound desc = could not find container \"b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\": container with ID starting with b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.501722 4695 scope.go:117] "RemoveContainer" containerID="629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.501940 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866"} err="failed to get container status \"629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\": rpc error: code = NotFound desc = could not find container \"629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\": container with ID starting with 629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.501960 4695 scope.go:117] "RemoveContainer" containerID="a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.502270 4695 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38"} err="failed to get container status \"a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\": rpc error: code = NotFound desc = could not find container \"a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\": container with ID starting with a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.502293 4695 scope.go:117] "RemoveContainer" containerID="e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.502589 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb"} err="failed to get container status \"e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\": rpc error: code = NotFound desc = could not find container \"e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\": container with ID starting with e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.502607 4695 scope.go:117] "RemoveContainer" containerID="253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.502813 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f"} err="failed to get container status \"253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\": rpc error: code = NotFound desc = could not find container \"253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\": container with ID starting with 253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f not found: ID does not 
exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.502834 4695 scope.go:117] "RemoveContainer" containerID="69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.503318 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094"} err="failed to get container status \"69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094\": rpc error: code = NotFound desc = could not find container \"69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094\": container with ID starting with 69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.503345 4695 scope.go:117] "RemoveContainer" containerID="dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.503649 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044"} err="failed to get container status \"dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044\": rpc error: code = NotFound desc = could not find container \"dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044\": container with ID starting with dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.503682 4695 scope.go:117] "RemoveContainer" containerID="d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.504098 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3"} err="failed to get container status 
\"d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\": rpc error: code = NotFound desc = could not find container \"d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\": container with ID starting with d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.504122 4695 scope.go:117] "RemoveContainer" containerID="8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.504377 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84"} err="failed to get container status \"8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\": rpc error: code = NotFound desc = could not find container \"8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\": container with ID starting with 8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.504408 4695 scope.go:117] "RemoveContainer" containerID="f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.504689 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596"} err="failed to get container status \"f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\": rpc error: code = NotFound desc = could not find container \"f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\": container with ID starting with f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.504711 4695 scope.go:117] "RemoveContainer" 
containerID="b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.505057 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380"} err="failed to get container status \"b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\": rpc error: code = NotFound desc = could not find container \"b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\": container with ID starting with b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.505078 4695 scope.go:117] "RemoveContainer" containerID="629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.505827 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866"} err="failed to get container status \"629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\": rpc error: code = NotFound desc = could not find container \"629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\": container with ID starting with 629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.505851 4695 scope.go:117] "RemoveContainer" containerID="a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.506295 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38"} err="failed to get container status \"a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\": rpc error: code = NotFound desc = could 
not find container \"a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\": container with ID starting with a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.506318 4695 scope.go:117] "RemoveContainer" containerID="e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.506698 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb"} err="failed to get container status \"e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\": rpc error: code = NotFound desc = could not find container \"e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\": container with ID starting with e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.506725 4695 scope.go:117] "RemoveContainer" containerID="253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.507050 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f"} err="failed to get container status \"253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\": rpc error: code = NotFound desc = could not find container \"253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\": container with ID starting with 253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.507073 4695 scope.go:117] "RemoveContainer" containerID="69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 
11:07:52.507320 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094"} err="failed to get container status \"69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094\": rpc error: code = NotFound desc = could not find container \"69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094\": container with ID starting with 69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.507344 4695 scope.go:117] "RemoveContainer" containerID="dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.507604 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044"} err="failed to get container status \"dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044\": rpc error: code = NotFound desc = could not find container \"dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044\": container with ID starting with dde67bf71d474833f80ee720e52fab2f14bad8aaadf48f967dd9efa8f98f9044 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.507624 4695 scope.go:117] "RemoveContainer" containerID="d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.507897 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3"} err="failed to get container status \"d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\": rpc error: code = NotFound desc = could not find container \"d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3\": container with ID starting with 
d9d631e04ac9e6fee0bd8a51c55c3095840627daec846ed4da8da489ef53c1f3 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.507943 4695 scope.go:117] "RemoveContainer" containerID="8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.508301 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84"} err="failed to get container status \"8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\": rpc error: code = NotFound desc = could not find container \"8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84\": container with ID starting with 8127b81eed0bc7ef1b498ca377c7b8fbf0ba1cd4e4fcab600f1ebe98bd028c84 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.508325 4695 scope.go:117] "RemoveContainer" containerID="f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.508528 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596"} err="failed to get container status \"f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\": rpc error: code = NotFound desc = could not find container \"f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596\": container with ID starting with f3efa8015eebf85cfc9b73424dfcf788a2f633fc46105d6fdc17ff3d24094596 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.508549 4695 scope.go:117] "RemoveContainer" containerID="b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.508893 4695 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380"} err="failed to get container status \"b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\": rpc error: code = NotFound desc = could not find container \"b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380\": container with ID starting with b53e329c7f89cd48648ef748ecea0da28995ccbcd9916ca218550293eaa2b380 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.508935 4695 scope.go:117] "RemoveContainer" containerID="629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.509159 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866"} err="failed to get container status \"629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\": rpc error: code = NotFound desc = could not find container \"629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866\": container with ID starting with 629c6292eb9c960f8412a2d40ad100504a70107288cd22b35d57d505a6386866 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.509181 4695 scope.go:117] "RemoveContainer" containerID="a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.509481 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38"} err="failed to get container status \"a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\": rpc error: code = NotFound desc = could not find container \"a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38\": container with ID starting with a3bd72838aed093f030052b548ecced7da54f5c3c46d6da15f8b515c750dcf38 not found: ID does not 
exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.509507 4695 scope.go:117] "RemoveContainer" containerID="e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.509790 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb"} err="failed to get container status \"e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\": rpc error: code = NotFound desc = could not find container \"e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb\": container with ID starting with e30c5e7180d2523aef8cf1bb9a4347088451b9edd3997503b616cf2a74a87ebb not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.509810 4695 scope.go:117] "RemoveContainer" containerID="253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.510107 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f"} err="failed to get container status \"253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\": rpc error: code = NotFound desc = could not find container \"253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f\": container with ID starting with 253af5c0eb0bc259313e9ea4001fe5bfbeba1b443ec73571b3d9e90e5895345f not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.510129 4695 scope.go:117] "RemoveContainer" containerID="69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.510421 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094"} err="failed to get container status 
\"69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094\": rpc error: code = NotFound desc = could not find container \"69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094\": container with ID starting with 69459deacf6e07427dc7e702a198147a64a4dfe9ee33aeb1d5ba84d3b3562094 not found: ID does not exist" Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.535892 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:07:52 crc kubenswrapper[4695]: W0320 11:07:52.553094 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod899952c4_d6c4_4587_8031_51da28cb5645.slice/crio-af712189f3339208e3218e5b4fb2a3afa59704859a45d6be642a68bd3fffc582 WatchSource:0}: Error finding container af712189f3339208e3218e5b4fb2a3afa59704859a45d6be642a68bd3fffc582: Status 404 returned error can't find the container with id af712189f3339208e3218e5b4fb2a3afa59704859a45d6be642a68bd3fffc582 Mar 20 11:07:52 crc kubenswrapper[4695]: I0320 11:07:52.897381 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7010d107-c3b1-4cc2-83c2-523df13ecd43" path="/var/lib/kubelet/pods/7010d107-c3b1-4cc2-83c2-523df13ecd43/volumes" Mar 20 11:07:53 crc kubenswrapper[4695]: I0320 11:07:53.316328 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hg7g5_52301735-de4f-4672-9e4d-6bd74bccedad/kube-multus/2.log" Mar 20 11:07:53 crc kubenswrapper[4695]: I0320 11:07:53.316997 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hg7g5_52301735-de4f-4672-9e4d-6bd74bccedad/kube-multus/1.log" Mar 20 11:07:53 crc kubenswrapper[4695]: I0320 11:07:53.317083 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hg7g5" 
event={"ID":"52301735-de4f-4672-9e4d-6bd74bccedad","Type":"ContainerStarted","Data":"432fc9f9c488a98e6267cd67d376e21fe7bcd059f59b3eaa3d7379c417016df2"} Mar 20 11:07:53 crc kubenswrapper[4695]: I0320 11:07:53.324874 4695 generic.go:334] "Generic (PLEG): container finished" podID="899952c4-d6c4-4587-8031-51da28cb5645" containerID="cdba506398996b98e5e542d26f05e895bdd959076f1e93f1eac6f430f2718456" exitCode=0 Mar 20 11:07:53 crc kubenswrapper[4695]: I0320 11:07:53.324990 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" event={"ID":"899952c4-d6c4-4587-8031-51da28cb5645","Type":"ContainerDied","Data":"cdba506398996b98e5e542d26f05e895bdd959076f1e93f1eac6f430f2718456"} Mar 20 11:07:53 crc kubenswrapper[4695]: I0320 11:07:53.325033 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" event={"ID":"899952c4-d6c4-4587-8031-51da28cb5645","Type":"ContainerStarted","Data":"af712189f3339208e3218e5b4fb2a3afa59704859a45d6be642a68bd3fffc582"} Mar 20 11:07:53 crc kubenswrapper[4695]: I0320 11:07:53.446464 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-tgcp4" Mar 20 11:07:54 crc kubenswrapper[4695]: I0320 11:07:54.335508 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" event={"ID":"899952c4-d6c4-4587-8031-51da28cb5645","Type":"ContainerStarted","Data":"c308e5e62ba2375f7e400a239a0c1576f6fe50b686701a94ebdefac3e78f736c"} Mar 20 11:07:54 crc kubenswrapper[4695]: I0320 11:07:54.336060 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" event={"ID":"899952c4-d6c4-4587-8031-51da28cb5645","Type":"ContainerStarted","Data":"600fe8f96648f01cd17c25a0d005d18829a7abf3f386ddca803191ffe4e38a3c"} Mar 20 11:07:54 crc kubenswrapper[4695]: I0320 11:07:54.336075 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" event={"ID":"899952c4-d6c4-4587-8031-51da28cb5645","Type":"ContainerStarted","Data":"7c44c28ebd27eb1de7512c52083742ebe44fe0f21cefd3885d96f168a894f8f8"} Mar 20 11:07:54 crc kubenswrapper[4695]: I0320 11:07:54.336085 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" event={"ID":"899952c4-d6c4-4587-8031-51da28cb5645","Type":"ContainerStarted","Data":"6ba977ef0a00fcc4d713d3013a93cdeb229d4c3bb613319e29fd9c58ce988ed2"} Mar 20 11:07:54 crc kubenswrapper[4695]: I0320 11:07:54.336096 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" event={"ID":"899952c4-d6c4-4587-8031-51da28cb5645","Type":"ContainerStarted","Data":"d1b461b98223275159091d103a9e854ea78992e7fa16ab8707943b57f9397500"} Mar 20 11:07:54 crc kubenswrapper[4695]: I0320 11:07:54.336107 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" event={"ID":"899952c4-d6c4-4587-8031-51da28cb5645","Type":"ContainerStarted","Data":"7188a984e3eaa8affd03ccfbe331269967a1c8822f9822114b3e3d4ea17db2f5"} Mar 20 11:07:57 crc kubenswrapper[4695]: I0320 11:07:57.357643 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" event={"ID":"899952c4-d6c4-4587-8031-51da28cb5645","Type":"ContainerStarted","Data":"a6ce77b6cb99cbfb2c658248021e5f6a23738b748dea2c3c2f354deeef27c7d4"} Mar 20 11:08:00 crc kubenswrapper[4695]: I0320 11:08:00.154004 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566748-jlvx2"] Mar 20 11:08:00 crc kubenswrapper[4695]: I0320 11:08:00.173743 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566748-jlvx2" Mar 20 11:08:00 crc kubenswrapper[4695]: I0320 11:08:00.179687 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5kqds" Mar 20 11:08:00 crc kubenswrapper[4695]: I0320 11:08:00.179973 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:08:00 crc kubenswrapper[4695]: I0320 11:08:00.183541 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:08:00 crc kubenswrapper[4695]: I0320 11:08:00.321228 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn4vb\" (UniqueName: \"kubernetes.io/projected/f15d73b7-72cb-4cee-8019-e27bf7c0e646-kube-api-access-hn4vb\") pod \"auto-csr-approver-29566748-jlvx2\" (UID: \"f15d73b7-72cb-4cee-8019-e27bf7c0e646\") " pod="openshift-infra/auto-csr-approver-29566748-jlvx2" Mar 20 11:08:00 crc kubenswrapper[4695]: I0320 11:08:00.387383 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" event={"ID":"899952c4-d6c4-4587-8031-51da28cb5645","Type":"ContainerStarted","Data":"d95eb3d2325456dd27b2036f8741788436e3832a2d38b6028950f716baae7179"} Mar 20 11:08:00 crc kubenswrapper[4695]: I0320 11:08:00.387810 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:08:00 crc kubenswrapper[4695]: I0320 11:08:00.387877 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:08:00 crc kubenswrapper[4695]: I0320 11:08:00.420000 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:08:00 crc kubenswrapper[4695]: I0320 11:08:00.422920 4695 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn4vb\" (UniqueName: \"kubernetes.io/projected/f15d73b7-72cb-4cee-8019-e27bf7c0e646-kube-api-access-hn4vb\") pod \"auto-csr-approver-29566748-jlvx2\" (UID: \"f15d73b7-72cb-4cee-8019-e27bf7c0e646\") " pod="openshift-infra/auto-csr-approver-29566748-jlvx2" Mar 20 11:08:00 crc kubenswrapper[4695]: I0320 11:08:00.426481 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" podStartSLOduration=8.426448223 podStartE2EDuration="8.426448223s" podCreationTimestamp="2026-03-20 11:07:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:08:00.423233271 +0000 UTC m=+858.203838854" watchObservedRunningTime="2026-03-20 11:08:00.426448223 +0000 UTC m=+858.207053786" Mar 20 11:08:00 crc kubenswrapper[4695]: I0320 11:08:00.452121 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn4vb\" (UniqueName: \"kubernetes.io/projected/f15d73b7-72cb-4cee-8019-e27bf7c0e646-kube-api-access-hn4vb\") pod \"auto-csr-approver-29566748-jlvx2\" (UID: \"f15d73b7-72cb-4cee-8019-e27bf7c0e646\") " pod="openshift-infra/auto-csr-approver-29566748-jlvx2" Mar 20 11:08:00 crc kubenswrapper[4695]: I0320 11:08:00.497845 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566748-jlvx2" Mar 20 11:08:00 crc kubenswrapper[4695]: E0320 11:08:00.525556 4695 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566748-jlvx2_openshift-infra_f15d73b7-72cb-4cee-8019-e27bf7c0e646_0(bc9d8613ff3b61a9736658932c41e773974991335ee635962f3a205e6282eb4d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 11:08:00 crc kubenswrapper[4695]: E0320 11:08:00.525656 4695 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566748-jlvx2_openshift-infra_f15d73b7-72cb-4cee-8019-e27bf7c0e646_0(bc9d8613ff3b61a9736658932c41e773974991335ee635962f3a205e6282eb4d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29566748-jlvx2" Mar 20 11:08:00 crc kubenswrapper[4695]: E0320 11:08:00.525684 4695 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566748-jlvx2_openshift-infra_f15d73b7-72cb-4cee-8019-e27bf7c0e646_0(bc9d8613ff3b61a9736658932c41e773974991335ee635962f3a205e6282eb4d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29566748-jlvx2" Mar 20 11:08:00 crc kubenswrapper[4695]: E0320 11:08:00.525748 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29566748-jlvx2_openshift-infra(f15d73b7-72cb-4cee-8019-e27bf7c0e646)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29566748-jlvx2_openshift-infra(f15d73b7-72cb-4cee-8019-e27bf7c0e646)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566748-jlvx2_openshift-infra_f15d73b7-72cb-4cee-8019-e27bf7c0e646_0(bc9d8613ff3b61a9736658932c41e773974991335ee635962f3a205e6282eb4d): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29566748-jlvx2" podUID="f15d73b7-72cb-4cee-8019-e27bf7c0e646" Mar 20 11:08:00 crc kubenswrapper[4695]: I0320 11:08:00.643074 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566748-jlvx2"] Mar 20 11:08:01 crc kubenswrapper[4695]: I0320 11:08:01.393069 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566748-jlvx2" Mar 20 11:08:01 crc kubenswrapper[4695]: I0320 11:08:01.393519 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566748-jlvx2" Mar 20 11:08:01 crc kubenswrapper[4695]: I0320 11:08:01.393649 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:08:01 crc kubenswrapper[4695]: E0320 11:08:01.418576 4695 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566748-jlvx2_openshift-infra_f15d73b7-72cb-4cee-8019-e27bf7c0e646_0(02bd94487bb53efa000a27d02f9f57fd3a123fd336d8ff2fde17215b93861e8d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 11:08:01 crc kubenswrapper[4695]: E0320 11:08:01.418996 4695 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566748-jlvx2_openshift-infra_f15d73b7-72cb-4cee-8019-e27bf7c0e646_0(02bd94487bb53efa000a27d02f9f57fd3a123fd336d8ff2fde17215b93861e8d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-infra/auto-csr-approver-29566748-jlvx2" Mar 20 11:08:01 crc kubenswrapper[4695]: E0320 11:08:01.419033 4695 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566748-jlvx2_openshift-infra_f15d73b7-72cb-4cee-8019-e27bf7c0e646_0(02bd94487bb53efa000a27d02f9f57fd3a123fd336d8ff2fde17215b93861e8d): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-infra/auto-csr-approver-29566748-jlvx2" Mar 20 11:08:01 crc kubenswrapper[4695]: E0320 11:08:01.419200 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"auto-csr-approver-29566748-jlvx2_openshift-infra(f15d73b7-72cb-4cee-8019-e27bf7c0e646)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"auto-csr-approver-29566748-jlvx2_openshift-infra(f15d73b7-72cb-4cee-8019-e27bf7c0e646)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_auto-csr-approver-29566748-jlvx2_openshift-infra_f15d73b7-72cb-4cee-8019-e27bf7c0e646_0(02bd94487bb53efa000a27d02f9f57fd3a123fd336d8ff2fde17215b93861e8d): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-infra/auto-csr-approver-29566748-jlvx2" podUID="f15d73b7-72cb-4cee-8019-e27bf7c0e646" Mar 20 11:08:01 crc kubenswrapper[4695]: I0320 11:08:01.430941 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:08:05 crc kubenswrapper[4695]: I0320 11:08:05.127672 4695 scope.go:117] "RemoveContainer" containerID="05945c73a4c93e5fd6ce1bc322d9acf71d5c1005cd47a22875bfe3b8a3eb8806" Mar 20 11:08:05 crc kubenswrapper[4695]: I0320 11:08:05.422500 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-hg7g5_52301735-de4f-4672-9e4d-6bd74bccedad/kube-multus/2.log" Mar 20 11:08:15 crc kubenswrapper[4695]: I0320 11:08:15.886361 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566748-jlvx2" Mar 20 11:08:15 crc kubenswrapper[4695]: I0320 11:08:15.887637 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566748-jlvx2" Mar 20 11:08:16 crc kubenswrapper[4695]: I0320 11:08:16.140759 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566748-jlvx2"] Mar 20 11:08:16 crc kubenswrapper[4695]: I0320 11:08:16.509262 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566748-jlvx2" event={"ID":"f15d73b7-72cb-4cee-8019-e27bf7c0e646","Type":"ContainerStarted","Data":"a465d4c5f40a41af45fdcbbefca30b74b65f82e459c815e46e9febeef959a8c7"} Mar 20 11:08:17 crc kubenswrapper[4695]: I0320 11:08:17.520628 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566748-jlvx2" event={"ID":"f15d73b7-72cb-4cee-8019-e27bf7c0e646","Type":"ContainerStarted","Data":"8c51d9c542e5473472702678a9eb944d20800febece8e2918120e4a01e93b127"} Mar 20 11:08:17 crc kubenswrapper[4695]: I0320 11:08:17.539369 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566748-jlvx2" podStartSLOduration=16.522718088 podStartE2EDuration="17.539334815s" podCreationTimestamp="2026-03-20 11:08:00 +0000 UTC" firstStartedPulling="2026-03-20 11:08:16.150099163 +0000 UTC m=+873.930704726" lastFinishedPulling="2026-03-20 11:08:17.16671589 +0000 UTC m=+874.947321453" observedRunningTime="2026-03-20 11:08:17.536888362 +0000 UTC m=+875.317493935" watchObservedRunningTime="2026-03-20 11:08:17.539334815 +0000 UTC m=+875.319940378" Mar 20 11:08:18 crc kubenswrapper[4695]: I0320 11:08:18.530661 4695 generic.go:334] "Generic (PLEG): container finished" podID="f15d73b7-72cb-4cee-8019-e27bf7c0e646" containerID="8c51d9c542e5473472702678a9eb944d20800febece8e2918120e4a01e93b127" exitCode=0 Mar 20 11:08:18 crc kubenswrapper[4695]: I0320 11:08:18.530804 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566748-jlvx2" 
event={"ID":"f15d73b7-72cb-4cee-8019-e27bf7c0e646","Type":"ContainerDied","Data":"8c51d9c542e5473472702678a9eb944d20800febece8e2918120e4a01e93b127"} Mar 20 11:08:19 crc kubenswrapper[4695]: I0320 11:08:19.769248 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566748-jlvx2" Mar 20 11:08:19 crc kubenswrapper[4695]: I0320 11:08:19.837732 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn4vb\" (UniqueName: \"kubernetes.io/projected/f15d73b7-72cb-4cee-8019-e27bf7c0e646-kube-api-access-hn4vb\") pod \"f15d73b7-72cb-4cee-8019-e27bf7c0e646\" (UID: \"f15d73b7-72cb-4cee-8019-e27bf7c0e646\") " Mar 20 11:08:19 crc kubenswrapper[4695]: I0320 11:08:19.845650 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f15d73b7-72cb-4cee-8019-e27bf7c0e646-kube-api-access-hn4vb" (OuterVolumeSpecName: "kube-api-access-hn4vb") pod "f15d73b7-72cb-4cee-8019-e27bf7c0e646" (UID: "f15d73b7-72cb-4cee-8019-e27bf7c0e646"). InnerVolumeSpecName "kube-api-access-hn4vb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:08:19 crc kubenswrapper[4695]: I0320 11:08:19.940520 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn4vb\" (UniqueName: \"kubernetes.io/projected/f15d73b7-72cb-4cee-8019-e27bf7c0e646-kube-api-access-hn4vb\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:20 crc kubenswrapper[4695]: I0320 11:08:20.551138 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566748-jlvx2" event={"ID":"f15d73b7-72cb-4cee-8019-e27bf7c0e646","Type":"ContainerDied","Data":"a465d4c5f40a41af45fdcbbefca30b74b65f82e459c815e46e9febeef959a8c7"} Mar 20 11:08:20 crc kubenswrapper[4695]: I0320 11:08:20.551188 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566748-jlvx2" Mar 20 11:08:20 crc kubenswrapper[4695]: I0320 11:08:20.551196 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a465d4c5f40a41af45fdcbbefca30b74b65f82e459c815e46e9febeef959a8c7" Mar 20 11:08:20 crc kubenswrapper[4695]: I0320 11:08:20.594123 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566742-m6jqc"] Mar 20 11:08:20 crc kubenswrapper[4695]: I0320 11:08:20.597387 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566742-m6jqc"] Mar 20 11:08:20 crc kubenswrapper[4695]: I0320 11:08:20.896439 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7961256-fd7f-48a0-ab69-bc6c6bb0f37a" path="/var/lib/kubelet/pods/b7961256-fd7f-48a0-ab69-bc6c6bb0f37a/volumes" Mar 20 11:08:22 crc kubenswrapper[4695]: I0320 11:08:22.573007 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bq2lk" Mar 20 11:08:36 crc kubenswrapper[4695]: I0320 11:08:36.286430 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq"] Mar 20 11:08:36 crc kubenswrapper[4695]: E0320 11:08:36.287396 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f15d73b7-72cb-4cee-8019-e27bf7c0e646" containerName="oc" Mar 20 11:08:36 crc kubenswrapper[4695]: I0320 11:08:36.287411 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="f15d73b7-72cb-4cee-8019-e27bf7c0e646" containerName="oc" Mar 20 11:08:36 crc kubenswrapper[4695]: I0320 11:08:36.287544 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="f15d73b7-72cb-4cee-8019-e27bf7c0e646" containerName="oc" Mar 20 11:08:36 crc kubenswrapper[4695]: I0320 11:08:36.288509 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq" Mar 20 11:08:36 crc kubenswrapper[4695]: I0320 11:08:36.291653 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 11:08:36 crc kubenswrapper[4695]: I0320 11:08:36.300799 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq"] Mar 20 11:08:36 crc kubenswrapper[4695]: I0320 11:08:36.451096 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dc81d3d9-b38c-47dd-9429-21ea474dd393-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq\" (UID: \"dc81d3d9-b38c-47dd-9429-21ea474dd393\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq" Mar 20 11:08:36 crc kubenswrapper[4695]: I0320 11:08:36.451183 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqrbm\" (UniqueName: \"kubernetes.io/projected/dc81d3d9-b38c-47dd-9429-21ea474dd393-kube-api-access-mqrbm\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq\" (UID: \"dc81d3d9-b38c-47dd-9429-21ea474dd393\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq" Mar 20 11:08:36 crc kubenswrapper[4695]: I0320 11:08:36.451243 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dc81d3d9-b38c-47dd-9429-21ea474dd393-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq\" (UID: \"dc81d3d9-b38c-47dd-9429-21ea474dd393\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq" Mar 20 11:08:36 crc kubenswrapper[4695]: 
I0320 11:08:36.552751 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqrbm\" (UniqueName: \"kubernetes.io/projected/dc81d3d9-b38c-47dd-9429-21ea474dd393-kube-api-access-mqrbm\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq\" (UID: \"dc81d3d9-b38c-47dd-9429-21ea474dd393\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq" Mar 20 11:08:36 crc kubenswrapper[4695]: I0320 11:08:36.552850 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dc81d3d9-b38c-47dd-9429-21ea474dd393-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq\" (UID: \"dc81d3d9-b38c-47dd-9429-21ea474dd393\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq" Mar 20 11:08:36 crc kubenswrapper[4695]: I0320 11:08:36.552895 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dc81d3d9-b38c-47dd-9429-21ea474dd393-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq\" (UID: \"dc81d3d9-b38c-47dd-9429-21ea474dd393\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq" Mar 20 11:08:36 crc kubenswrapper[4695]: I0320 11:08:36.553539 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dc81d3d9-b38c-47dd-9429-21ea474dd393-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq\" (UID: \"dc81d3d9-b38c-47dd-9429-21ea474dd393\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq" Mar 20 11:08:36 crc kubenswrapper[4695]: I0320 11:08:36.553758 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/dc81d3d9-b38c-47dd-9429-21ea474dd393-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq\" (UID: \"dc81d3d9-b38c-47dd-9429-21ea474dd393\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq" Mar 20 11:08:36 crc kubenswrapper[4695]: I0320 11:08:36.576816 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqrbm\" (UniqueName: \"kubernetes.io/projected/dc81d3d9-b38c-47dd-9429-21ea474dd393-kube-api-access-mqrbm\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq\" (UID: \"dc81d3d9-b38c-47dd-9429-21ea474dd393\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq" Mar 20 11:08:36 crc kubenswrapper[4695]: I0320 11:08:36.614795 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq" Mar 20 11:08:36 crc kubenswrapper[4695]: I0320 11:08:36.860859 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq"] Mar 20 11:08:37 crc kubenswrapper[4695]: I0320 11:08:37.672121 4695 generic.go:334] "Generic (PLEG): container finished" podID="dc81d3d9-b38c-47dd-9429-21ea474dd393" containerID="e4914af8a36b34578389068c4e5e4b71354453982a4219f6b7eafb87c0d871c1" exitCode=0 Mar 20 11:08:37 crc kubenswrapper[4695]: I0320 11:08:37.672252 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq" event={"ID":"dc81d3d9-b38c-47dd-9429-21ea474dd393","Type":"ContainerDied","Data":"e4914af8a36b34578389068c4e5e4b71354453982a4219f6b7eafb87c0d871c1"} Mar 20 11:08:37 crc kubenswrapper[4695]: I0320 11:08:37.672595 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq" event={"ID":"dc81d3d9-b38c-47dd-9429-21ea474dd393","Type":"ContainerStarted","Data":"9d51990a2c9379ddb9beffa506448bdac2edd24556435b165612c0bcba498b59"} Mar 20 11:08:38 crc kubenswrapper[4695]: I0320 11:08:38.052209 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8p6fl"] Mar 20 11:08:38 crc kubenswrapper[4695]: I0320 11:08:38.053690 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8p6fl" Mar 20 11:08:38 crc kubenswrapper[4695]: I0320 11:08:38.060271 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8p6fl"] Mar 20 11:08:38 crc kubenswrapper[4695]: I0320 11:08:38.075016 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2w2b\" (UniqueName: \"kubernetes.io/projected/1dac6a8e-7989-40f1-b930-40f33a30db9b-kube-api-access-c2w2b\") pod \"redhat-operators-8p6fl\" (UID: \"1dac6a8e-7989-40f1-b930-40f33a30db9b\") " pod="openshift-marketplace/redhat-operators-8p6fl" Mar 20 11:08:38 crc kubenswrapper[4695]: I0320 11:08:38.075170 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dac6a8e-7989-40f1-b930-40f33a30db9b-utilities\") pod \"redhat-operators-8p6fl\" (UID: \"1dac6a8e-7989-40f1-b930-40f33a30db9b\") " pod="openshift-marketplace/redhat-operators-8p6fl" Mar 20 11:08:38 crc kubenswrapper[4695]: I0320 11:08:38.075242 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dac6a8e-7989-40f1-b930-40f33a30db9b-catalog-content\") pod \"redhat-operators-8p6fl\" (UID: \"1dac6a8e-7989-40f1-b930-40f33a30db9b\") " 
pod="openshift-marketplace/redhat-operators-8p6fl" Mar 20 11:08:38 crc kubenswrapper[4695]: I0320 11:08:38.176108 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2w2b\" (UniqueName: \"kubernetes.io/projected/1dac6a8e-7989-40f1-b930-40f33a30db9b-kube-api-access-c2w2b\") pod \"redhat-operators-8p6fl\" (UID: \"1dac6a8e-7989-40f1-b930-40f33a30db9b\") " pod="openshift-marketplace/redhat-operators-8p6fl" Mar 20 11:08:38 crc kubenswrapper[4695]: I0320 11:08:38.176224 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dac6a8e-7989-40f1-b930-40f33a30db9b-utilities\") pod \"redhat-operators-8p6fl\" (UID: \"1dac6a8e-7989-40f1-b930-40f33a30db9b\") " pod="openshift-marketplace/redhat-operators-8p6fl" Mar 20 11:08:38 crc kubenswrapper[4695]: I0320 11:08:38.176274 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dac6a8e-7989-40f1-b930-40f33a30db9b-catalog-content\") pod \"redhat-operators-8p6fl\" (UID: \"1dac6a8e-7989-40f1-b930-40f33a30db9b\") " pod="openshift-marketplace/redhat-operators-8p6fl" Mar 20 11:08:38 crc kubenswrapper[4695]: I0320 11:08:38.176862 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dac6a8e-7989-40f1-b930-40f33a30db9b-utilities\") pod \"redhat-operators-8p6fl\" (UID: \"1dac6a8e-7989-40f1-b930-40f33a30db9b\") " pod="openshift-marketplace/redhat-operators-8p6fl" Mar 20 11:08:38 crc kubenswrapper[4695]: I0320 11:08:38.177040 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dac6a8e-7989-40f1-b930-40f33a30db9b-catalog-content\") pod \"redhat-operators-8p6fl\" (UID: \"1dac6a8e-7989-40f1-b930-40f33a30db9b\") " pod="openshift-marketplace/redhat-operators-8p6fl" Mar 20 11:08:38 crc 
kubenswrapper[4695]: I0320 11:08:38.198241 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2w2b\" (UniqueName: \"kubernetes.io/projected/1dac6a8e-7989-40f1-b930-40f33a30db9b-kube-api-access-c2w2b\") pod \"redhat-operators-8p6fl\" (UID: \"1dac6a8e-7989-40f1-b930-40f33a30db9b\") " pod="openshift-marketplace/redhat-operators-8p6fl" Mar 20 11:08:38 crc kubenswrapper[4695]: I0320 11:08:38.376898 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8p6fl" Mar 20 11:08:38 crc kubenswrapper[4695]: I0320 11:08:38.823754 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8p6fl"] Mar 20 11:08:39 crc kubenswrapper[4695]: I0320 11:08:39.688572 4695 generic.go:334] "Generic (PLEG): container finished" podID="1dac6a8e-7989-40f1-b930-40f33a30db9b" containerID="566998108ed5bbc1ea505b1cefebb12c30da43ca4c6f2afe3aa3dc4e952dde7e" exitCode=0 Mar 20 11:08:39 crc kubenswrapper[4695]: I0320 11:08:39.688682 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p6fl" event={"ID":"1dac6a8e-7989-40f1-b930-40f33a30db9b","Type":"ContainerDied","Data":"566998108ed5bbc1ea505b1cefebb12c30da43ca4c6f2afe3aa3dc4e952dde7e"} Mar 20 11:08:39 crc kubenswrapper[4695]: I0320 11:08:39.689100 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p6fl" event={"ID":"1dac6a8e-7989-40f1-b930-40f33a30db9b","Type":"ContainerStarted","Data":"b4d880be15d305d1542af56adb027fefdfaf71bc234808ad4a99800293ce6c43"} Mar 20 11:08:39 crc kubenswrapper[4695]: I0320 11:08:39.694493 4695 generic.go:334] "Generic (PLEG): container finished" podID="dc81d3d9-b38c-47dd-9429-21ea474dd393" containerID="8c3919f14f3e8ed333cbcfa1cf44ca7b601385204a3ee67e94ba01ebfac9c871" exitCode=0 Mar 20 11:08:39 crc kubenswrapper[4695]: I0320 11:08:39.694538 4695 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq" event={"ID":"dc81d3d9-b38c-47dd-9429-21ea474dd393","Type":"ContainerDied","Data":"8c3919f14f3e8ed333cbcfa1cf44ca7b601385204a3ee67e94ba01ebfac9c871"} Mar 20 11:08:40 crc kubenswrapper[4695]: I0320 11:08:40.703546 4695 generic.go:334] "Generic (PLEG): container finished" podID="dc81d3d9-b38c-47dd-9429-21ea474dd393" containerID="bb34f5dc602c2d65bab1890c1fc07604043c9d8ff9a2899fbeb642385f8568f1" exitCode=0 Mar 20 11:08:40 crc kubenswrapper[4695]: I0320 11:08:40.703627 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq" event={"ID":"dc81d3d9-b38c-47dd-9429-21ea474dd393","Type":"ContainerDied","Data":"bb34f5dc602c2d65bab1890c1fc07604043c9d8ff9a2899fbeb642385f8568f1"} Mar 20 11:08:41 crc kubenswrapper[4695]: I0320 11:08:41.714707 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p6fl" event={"ID":"1dac6a8e-7989-40f1-b930-40f33a30db9b","Type":"ContainerStarted","Data":"c93937769decd5c8773798be67439d854ce64006fda6cba274c08b42f5d46c28"} Mar 20 11:08:42 crc kubenswrapper[4695]: I0320 11:08:42.447228 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq" Mar 20 11:08:42 crc kubenswrapper[4695]: I0320 11:08:42.543424 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dc81d3d9-b38c-47dd-9429-21ea474dd393-bundle\") pod \"dc81d3d9-b38c-47dd-9429-21ea474dd393\" (UID: \"dc81d3d9-b38c-47dd-9429-21ea474dd393\") " Mar 20 11:08:42 crc kubenswrapper[4695]: I0320 11:08:42.543634 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqrbm\" (UniqueName: \"kubernetes.io/projected/dc81d3d9-b38c-47dd-9429-21ea474dd393-kube-api-access-mqrbm\") pod \"dc81d3d9-b38c-47dd-9429-21ea474dd393\" (UID: \"dc81d3d9-b38c-47dd-9429-21ea474dd393\") " Mar 20 11:08:42 crc kubenswrapper[4695]: I0320 11:08:42.543755 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dc81d3d9-b38c-47dd-9429-21ea474dd393-util\") pod \"dc81d3d9-b38c-47dd-9429-21ea474dd393\" (UID: \"dc81d3d9-b38c-47dd-9429-21ea474dd393\") " Mar 20 11:08:42 crc kubenswrapper[4695]: I0320 11:08:42.545318 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc81d3d9-b38c-47dd-9429-21ea474dd393-bundle" (OuterVolumeSpecName: "bundle") pod "dc81d3d9-b38c-47dd-9429-21ea474dd393" (UID: "dc81d3d9-b38c-47dd-9429-21ea474dd393"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:08:42 crc kubenswrapper[4695]: I0320 11:08:42.585728 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc81d3d9-b38c-47dd-9429-21ea474dd393-kube-api-access-mqrbm" (OuterVolumeSpecName: "kube-api-access-mqrbm") pod "dc81d3d9-b38c-47dd-9429-21ea474dd393" (UID: "dc81d3d9-b38c-47dd-9429-21ea474dd393"). InnerVolumeSpecName "kube-api-access-mqrbm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:08:42 crc kubenswrapper[4695]: I0320 11:08:42.645728 4695 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dc81d3d9-b38c-47dd-9429-21ea474dd393-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:42 crc kubenswrapper[4695]: I0320 11:08:42.645790 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqrbm\" (UniqueName: \"kubernetes.io/projected/dc81d3d9-b38c-47dd-9429-21ea474dd393-kube-api-access-mqrbm\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:42 crc kubenswrapper[4695]: I0320 11:08:42.655869 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc81d3d9-b38c-47dd-9429-21ea474dd393-util" (OuterVolumeSpecName: "util") pod "dc81d3d9-b38c-47dd-9429-21ea474dd393" (UID: "dc81d3d9-b38c-47dd-9429-21ea474dd393"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:08:42 crc kubenswrapper[4695]: I0320 11:08:42.724052 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq" Mar 20 11:08:42 crc kubenswrapper[4695]: I0320 11:08:42.725266 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq" event={"ID":"dc81d3d9-b38c-47dd-9429-21ea474dd393","Type":"ContainerDied","Data":"9d51990a2c9379ddb9beffa506448bdac2edd24556435b165612c0bcba498b59"} Mar 20 11:08:42 crc kubenswrapper[4695]: I0320 11:08:42.725425 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d51990a2c9379ddb9beffa506448bdac2edd24556435b165612c0bcba498b59" Mar 20 11:08:42 crc kubenswrapper[4695]: I0320 11:08:42.747331 4695 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dc81d3d9-b38c-47dd-9429-21ea474dd393-util\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:43 crc kubenswrapper[4695]: I0320 11:08:43.732898 4695 generic.go:334] "Generic (PLEG): container finished" podID="1dac6a8e-7989-40f1-b930-40f33a30db9b" containerID="c93937769decd5c8773798be67439d854ce64006fda6cba274c08b42f5d46c28" exitCode=0 Mar 20 11:08:43 crc kubenswrapper[4695]: I0320 11:08:43.732960 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p6fl" event={"ID":"1dac6a8e-7989-40f1-b930-40f33a30db9b","Type":"ContainerDied","Data":"c93937769decd5c8773798be67439d854ce64006fda6cba274c08b42f5d46c28"} Mar 20 11:08:44 crc kubenswrapper[4695]: I0320 11:08:44.744657 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p6fl" event={"ID":"1dac6a8e-7989-40f1-b930-40f33a30db9b","Type":"ContainerStarted","Data":"e7a4813d7d3f13378b10a16bb885cbabd83bc123f4074af139cca4ef4830c4be"} Mar 20 11:08:44 crc kubenswrapper[4695]: I0320 11:08:44.768494 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-8p6fl" podStartSLOduration=2.331618275 podStartE2EDuration="6.768475671s" podCreationTimestamp="2026-03-20 11:08:38 +0000 UTC" firstStartedPulling="2026-03-20 11:08:39.690396395 +0000 UTC m=+897.471001958" lastFinishedPulling="2026-03-20 11:08:44.127253791 +0000 UTC m=+901.907859354" observedRunningTime="2026-03-20 11:08:44.764967921 +0000 UTC m=+902.545573484" watchObservedRunningTime="2026-03-20 11:08:44.768475671 +0000 UTC m=+902.549081224"
Mar 20 11:08:46 crc kubenswrapper[4695]: I0320 11:08:46.974346 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-7xvf2"]
Mar 20 11:08:46 crc kubenswrapper[4695]: E0320 11:08:46.974756 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc81d3d9-b38c-47dd-9429-21ea474dd393" containerName="extract"
Mar 20 11:08:46 crc kubenswrapper[4695]: I0320 11:08:46.974773 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc81d3d9-b38c-47dd-9429-21ea474dd393" containerName="extract"
Mar 20 11:08:46 crc kubenswrapper[4695]: E0320 11:08:46.974788 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc81d3d9-b38c-47dd-9429-21ea474dd393" containerName="pull"
Mar 20 11:08:46 crc kubenswrapper[4695]: I0320 11:08:46.974794 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc81d3d9-b38c-47dd-9429-21ea474dd393" containerName="pull"
Mar 20 11:08:46 crc kubenswrapper[4695]: E0320 11:08:46.974819 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc81d3d9-b38c-47dd-9429-21ea474dd393" containerName="util"
Mar 20 11:08:46 crc kubenswrapper[4695]: I0320 11:08:46.974825 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc81d3d9-b38c-47dd-9429-21ea474dd393" containerName="util"
Mar 20 11:08:46 crc kubenswrapper[4695]: I0320 11:08:46.974959 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc81d3d9-b38c-47dd-9429-21ea474dd393" containerName="extract"
Mar 20 11:08:46 crc kubenswrapper[4695]: I0320 11:08:46.975533 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-7xvf2"
Mar 20 11:08:46 crc kubenswrapper[4695]: I0320 11:08:46.978842 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Mar 20 11:08:46 crc kubenswrapper[4695]: I0320 11:08:46.979088 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Mar 20 11:08:46 crc kubenswrapper[4695]: I0320 11:08:46.979302 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-gxhfp"
Mar 20 11:08:46 crc kubenswrapper[4695]: I0320 11:08:46.989507 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-7xvf2"]
Mar 20 11:08:47 crc kubenswrapper[4695]: I0320 11:08:47.119523 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4qcp\" (UniqueName: \"kubernetes.io/projected/1fd45e0f-120d-45c8-99c5-f8fc5e043df6-kube-api-access-n4qcp\") pod \"nmstate-operator-796d4cfff4-7xvf2\" (UID: \"1fd45e0f-120d-45c8-99c5-f8fc5e043df6\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-7xvf2"
Mar 20 11:08:47 crc kubenswrapper[4695]: I0320 11:08:47.221407 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4qcp\" (UniqueName: \"kubernetes.io/projected/1fd45e0f-120d-45c8-99c5-f8fc5e043df6-kube-api-access-n4qcp\") pod \"nmstate-operator-796d4cfff4-7xvf2\" (UID: \"1fd45e0f-120d-45c8-99c5-f8fc5e043df6\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-7xvf2"
Mar 20 11:08:47 crc kubenswrapper[4695]: I0320 11:08:47.247250 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4qcp\" (UniqueName: \"kubernetes.io/projected/1fd45e0f-120d-45c8-99c5-f8fc5e043df6-kube-api-access-n4qcp\") pod \"nmstate-operator-796d4cfff4-7xvf2\" (UID: \"1fd45e0f-120d-45c8-99c5-f8fc5e043df6\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-7xvf2"
Mar 20 11:08:47 crc kubenswrapper[4695]: I0320 11:08:47.296610 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-7xvf2"
Mar 20 11:08:47 crc kubenswrapper[4695]: I0320 11:08:47.866046 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-7xvf2"]
Mar 20 11:08:48 crc kubenswrapper[4695]: I0320 11:08:48.377269 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8p6fl"
Mar 20 11:08:48 crc kubenswrapper[4695]: I0320 11:08:48.379261 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8p6fl"
Mar 20 11:08:48 crc kubenswrapper[4695]: I0320 11:08:48.815165 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-7xvf2" event={"ID":"1fd45e0f-120d-45c8-99c5-f8fc5e043df6","Type":"ContainerStarted","Data":"ec8419eb3ffb9ff43c2a0805972edb8f678511c160ebf89d0637e15ac4f8d1a9"}
Mar 20 11:08:49 crc kubenswrapper[4695]: I0320 11:08:49.426572 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8p6fl" podUID="1dac6a8e-7989-40f1-b930-40f33a30db9b" containerName="registry-server" probeResult="failure" output=<
Mar 20 11:08:49 crc kubenswrapper[4695]: timeout: failed to connect service ":50051" within 1s
Mar 20 11:08:49 crc kubenswrapper[4695]: >
Mar 20 11:08:58 crc kubenswrapper[4695]: I0320 11:08:58.418990 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8p6fl"
Mar 20 11:08:58 crc kubenswrapper[4695]: I0320 11:08:58.471368 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8p6fl"
Mar 20 11:08:58 crc kubenswrapper[4695]: I0320 11:08:58.651704 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8p6fl"]
Mar 20 11:08:59 crc kubenswrapper[4695]: I0320 11:08:59.892045 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8p6fl" podUID="1dac6a8e-7989-40f1-b930-40f33a30db9b" containerName="registry-server" containerID="cri-o://e7a4813d7d3f13378b10a16bb885cbabd83bc123f4074af139cca4ef4830c4be" gracePeriod=2
Mar 20 11:09:00 crc kubenswrapper[4695]: I0320 11:09:00.760676 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8p6fl"
Mar 20 11:09:00 crc kubenswrapper[4695]: I0320 11:09:00.862983 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2w2b\" (UniqueName: \"kubernetes.io/projected/1dac6a8e-7989-40f1-b930-40f33a30db9b-kube-api-access-c2w2b\") pod \"1dac6a8e-7989-40f1-b930-40f33a30db9b\" (UID: \"1dac6a8e-7989-40f1-b930-40f33a30db9b\") "
Mar 20 11:09:00 crc kubenswrapper[4695]: I0320 11:09:00.863112 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dac6a8e-7989-40f1-b930-40f33a30db9b-catalog-content\") pod \"1dac6a8e-7989-40f1-b930-40f33a30db9b\" (UID: \"1dac6a8e-7989-40f1-b930-40f33a30db9b\") "
Mar 20 11:09:00 crc kubenswrapper[4695]: I0320 11:09:00.863136 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dac6a8e-7989-40f1-b930-40f33a30db9b-utilities\") pod \"1dac6a8e-7989-40f1-b930-40f33a30db9b\" (UID: \"1dac6a8e-7989-40f1-b930-40f33a30db9b\") "
Mar 20 11:09:00 crc kubenswrapper[4695]: I0320 11:09:00.864654 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dac6a8e-7989-40f1-b930-40f33a30db9b-utilities" (OuterVolumeSpecName: "utilities") pod "1dac6a8e-7989-40f1-b930-40f33a30db9b" (UID: "1dac6a8e-7989-40f1-b930-40f33a30db9b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:09:00 crc kubenswrapper[4695]: I0320 11:09:00.871569 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dac6a8e-7989-40f1-b930-40f33a30db9b-kube-api-access-c2w2b" (OuterVolumeSpecName: "kube-api-access-c2w2b") pod "1dac6a8e-7989-40f1-b930-40f33a30db9b" (UID: "1dac6a8e-7989-40f1-b930-40f33a30db9b"). InnerVolumeSpecName "kube-api-access-c2w2b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:09:00 crc kubenswrapper[4695]: I0320 11:09:00.900790 4695 generic.go:334] "Generic (PLEG): container finished" podID="1dac6a8e-7989-40f1-b930-40f33a30db9b" containerID="e7a4813d7d3f13378b10a16bb885cbabd83bc123f4074af139cca4ef4830c4be" exitCode=0
Mar 20 11:09:00 crc kubenswrapper[4695]: I0320 11:09:00.900961 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8p6fl"
Mar 20 11:09:00 crc kubenswrapper[4695]: I0320 11:09:00.902926 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p6fl" event={"ID":"1dac6a8e-7989-40f1-b930-40f33a30db9b","Type":"ContainerDied","Data":"e7a4813d7d3f13378b10a16bb885cbabd83bc123f4074af139cca4ef4830c4be"}
Mar 20 11:09:00 crc kubenswrapper[4695]: I0320 11:09:00.902982 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8p6fl" event={"ID":"1dac6a8e-7989-40f1-b930-40f33a30db9b","Type":"ContainerDied","Data":"b4d880be15d305d1542af56adb027fefdfaf71bc234808ad4a99800293ce6c43"}
Mar 20 11:09:00 crc kubenswrapper[4695]: I0320 11:09:00.903045 4695 scope.go:117] "RemoveContainer" containerID="e7a4813d7d3f13378b10a16bb885cbabd83bc123f4074af139cca4ef4830c4be"
Mar 20 11:09:00 crc kubenswrapper[4695]: I0320 11:09:00.926435 4695 scope.go:117] "RemoveContainer" containerID="c93937769decd5c8773798be67439d854ce64006fda6cba274c08b42f5d46c28"
Mar 20 11:09:00 crc kubenswrapper[4695]: I0320 11:09:00.948843 4695 scope.go:117] "RemoveContainer" containerID="566998108ed5bbc1ea505b1cefebb12c30da43ca4c6f2afe3aa3dc4e952dde7e"
Mar 20 11:09:00 crc kubenswrapper[4695]: I0320 11:09:00.964727 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2w2b\" (UniqueName: \"kubernetes.io/projected/1dac6a8e-7989-40f1-b930-40f33a30db9b-kube-api-access-c2w2b\") on node \"crc\" DevicePath \"\""
Mar 20 11:09:00 crc kubenswrapper[4695]: I0320 11:09:00.964782 4695 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dac6a8e-7989-40f1-b930-40f33a30db9b-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 11:09:00 crc kubenswrapper[4695]: I0320 11:09:00.967708 4695 scope.go:117] "RemoveContainer" containerID="e7a4813d7d3f13378b10a16bb885cbabd83bc123f4074af139cca4ef4830c4be"
Mar 20 11:09:00 crc kubenswrapper[4695]: E0320 11:09:00.968317 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7a4813d7d3f13378b10a16bb885cbabd83bc123f4074af139cca4ef4830c4be\": container with ID starting with e7a4813d7d3f13378b10a16bb885cbabd83bc123f4074af139cca4ef4830c4be not found: ID does not exist" containerID="e7a4813d7d3f13378b10a16bb885cbabd83bc123f4074af139cca4ef4830c4be"
Mar 20 11:09:00 crc kubenswrapper[4695]: I0320 11:09:00.968355 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7a4813d7d3f13378b10a16bb885cbabd83bc123f4074af139cca4ef4830c4be"} err="failed to get container status \"e7a4813d7d3f13378b10a16bb885cbabd83bc123f4074af139cca4ef4830c4be\": rpc error: code = NotFound desc = could not find container \"e7a4813d7d3f13378b10a16bb885cbabd83bc123f4074af139cca4ef4830c4be\": container with ID starting with e7a4813d7d3f13378b10a16bb885cbabd83bc123f4074af139cca4ef4830c4be not found: ID does not exist"
Mar 20 11:09:00 crc kubenswrapper[4695]: I0320 11:09:00.968382 4695 scope.go:117] "RemoveContainer" containerID="c93937769decd5c8773798be67439d854ce64006fda6cba274c08b42f5d46c28"
Mar 20 11:09:00 crc kubenswrapper[4695]: E0320 11:09:00.968853 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c93937769decd5c8773798be67439d854ce64006fda6cba274c08b42f5d46c28\": container with ID starting with c93937769decd5c8773798be67439d854ce64006fda6cba274c08b42f5d46c28 not found: ID does not exist" containerID="c93937769decd5c8773798be67439d854ce64006fda6cba274c08b42f5d46c28"
Mar 20 11:09:00 crc kubenswrapper[4695]: I0320 11:09:00.968959 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c93937769decd5c8773798be67439d854ce64006fda6cba274c08b42f5d46c28"} err="failed to get container status \"c93937769decd5c8773798be67439d854ce64006fda6cba274c08b42f5d46c28\": rpc error: code = NotFound desc = could not find container \"c93937769decd5c8773798be67439d854ce64006fda6cba274c08b42f5d46c28\": container with ID starting with c93937769decd5c8773798be67439d854ce64006fda6cba274c08b42f5d46c28 not found: ID does not exist"
Mar 20 11:09:00 crc kubenswrapper[4695]: I0320 11:09:00.969016 4695 scope.go:117] "RemoveContainer" containerID="566998108ed5bbc1ea505b1cefebb12c30da43ca4c6f2afe3aa3dc4e952dde7e"
Mar 20 11:09:00 crc kubenswrapper[4695]: E0320 11:09:00.969480 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"566998108ed5bbc1ea505b1cefebb12c30da43ca4c6f2afe3aa3dc4e952dde7e\": container with ID starting with 566998108ed5bbc1ea505b1cefebb12c30da43ca4c6f2afe3aa3dc4e952dde7e not found: ID does not exist" containerID="566998108ed5bbc1ea505b1cefebb12c30da43ca4c6f2afe3aa3dc4e952dde7e"
Mar 20 11:09:00 crc kubenswrapper[4695]: I0320 11:09:00.969520 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"566998108ed5bbc1ea505b1cefebb12c30da43ca4c6f2afe3aa3dc4e952dde7e"} err="failed to get container status \"566998108ed5bbc1ea505b1cefebb12c30da43ca4c6f2afe3aa3dc4e952dde7e\": rpc error: code = NotFound desc = could not find container \"566998108ed5bbc1ea505b1cefebb12c30da43ca4c6f2afe3aa3dc4e952dde7e\": container with ID starting with 566998108ed5bbc1ea505b1cefebb12c30da43ca4c6f2afe3aa3dc4e952dde7e not found: ID does not exist"
Mar 20 11:09:01 crc kubenswrapper[4695]: I0320 11:09:01.003096 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dac6a8e-7989-40f1-b930-40f33a30db9b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1dac6a8e-7989-40f1-b930-40f33a30db9b" (UID: "1dac6a8e-7989-40f1-b930-40f33a30db9b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:09:01 crc kubenswrapper[4695]: I0320 11:09:01.066344 4695 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dac6a8e-7989-40f1-b930-40f33a30db9b-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 11:09:01 crc kubenswrapper[4695]: I0320 11:09:01.239286 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8p6fl"]
Mar 20 11:09:01 crc kubenswrapper[4695]: I0320 11:09:01.245238 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8p6fl"]
Mar 20 11:09:02 crc kubenswrapper[4695]: I0320 11:09:02.895817 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dac6a8e-7989-40f1-b930-40f33a30db9b" path="/var/lib/kubelet/pods/1dac6a8e-7989-40f1-b930-40f33a30db9b/volumes"
Mar 20 11:09:03 crc kubenswrapper[4695]: I0320 11:09:03.923237 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-7xvf2" event={"ID":"1fd45e0f-120d-45c8-99c5-f8fc5e043df6","Type":"ContainerStarted","Data":"cd8c6bfcad41e0419457a527dca1d84ca56b076e8c9a4084314cef8998fd4ee9"}
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.067481 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-7xvf2" podStartSLOduration=3.684760742 podStartE2EDuration="19.067441363s" podCreationTimestamp="2026-03-20 11:08:46 +0000 UTC" firstStartedPulling="2026-03-20 11:08:47.877011583 +0000 UTC m=+905.657617146" lastFinishedPulling="2026-03-20 11:09:03.259692204 +0000 UTC m=+921.040297767" observedRunningTime="2026-03-20 11:09:03.946751266 +0000 UTC m=+921.727356829" watchObservedRunningTime="2026-03-20 11:09:05.067441363 +0000 UTC m=+922.848046916"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.073616 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-dc646"]
Mar 20 11:09:05 crc kubenswrapper[4695]: E0320 11:09:05.073982 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dac6a8e-7989-40f1-b930-40f33a30db9b" containerName="registry-server"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.074005 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dac6a8e-7989-40f1-b930-40f33a30db9b" containerName="registry-server"
Mar 20 11:09:05 crc kubenswrapper[4695]: E0320 11:09:05.074028 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dac6a8e-7989-40f1-b930-40f33a30db9b" containerName="extract-content"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.074040 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dac6a8e-7989-40f1-b930-40f33a30db9b" containerName="extract-content"
Mar 20 11:09:05 crc kubenswrapper[4695]: E0320 11:09:05.074057 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dac6a8e-7989-40f1-b930-40f33a30db9b" containerName="extract-utilities"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.074066 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dac6a8e-7989-40f1-b930-40f33a30db9b" containerName="extract-utilities"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.074199 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dac6a8e-7989-40f1-b930-40f33a30db9b" containerName="registry-server"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.075038 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-dc646"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.077416 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-wx9b6"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.091333 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-dc646"]
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.097019 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-2wb9n"]
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.098091 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-2wb9n"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.102413 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.106741 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-2wb9n"]
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.131992 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-bcl9w"]
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.133201 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-bcl9w"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.188843 4695 scope.go:117] "RemoveContainer" containerID="6f14d203e0a09c15d5b46a0a38c7e1ae507907bfda3952dede2c47d6c81cc366"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.230054 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tqll\" (UniqueName: \"kubernetes.io/projected/88d329fc-eca2-4b78-8aa7-af0444906ebf-kube-api-access-8tqll\") pod \"nmstate-webhook-5f558f5558-2wb9n\" (UID: \"88d329fc-eca2-4b78-8aa7-af0444906ebf\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-2wb9n"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.230118 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/136cb8fe-69bb-40ec-badd-ccb28dbf8a49-nmstate-lock\") pod \"nmstate-handler-bcl9w\" (UID: \"136cb8fe-69bb-40ec-badd-ccb28dbf8a49\") " pod="openshift-nmstate/nmstate-handler-bcl9w"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.230163 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/136cb8fe-69bb-40ec-badd-ccb28dbf8a49-dbus-socket\") pod \"nmstate-handler-bcl9w\" (UID: \"136cb8fe-69bb-40ec-badd-ccb28dbf8a49\") " pod="openshift-nmstate/nmstate-handler-bcl9w"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.230222 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whl26\" (UniqueName: \"kubernetes.io/projected/935a2ea2-68f9-4295-ab4d-53eb5403edfc-kube-api-access-whl26\") pod \"nmstate-metrics-9b8c8685d-dc646\" (UID: \"935a2ea2-68f9-4295-ab4d-53eb5403edfc\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-dc646"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.230280 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/88d329fc-eca2-4b78-8aa7-af0444906ebf-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-2wb9n\" (UID: \"88d329fc-eca2-4b78-8aa7-af0444906ebf\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-2wb9n"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.230306 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgzhg\" (UniqueName: \"kubernetes.io/projected/136cb8fe-69bb-40ec-badd-ccb28dbf8a49-kube-api-access-mgzhg\") pod \"nmstate-handler-bcl9w\" (UID: \"136cb8fe-69bb-40ec-badd-ccb28dbf8a49\") " pod="openshift-nmstate/nmstate-handler-bcl9w"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.230342 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/136cb8fe-69bb-40ec-badd-ccb28dbf8a49-ovs-socket\") pod \"nmstate-handler-bcl9w\" (UID: \"136cb8fe-69bb-40ec-badd-ccb28dbf8a49\") " pod="openshift-nmstate/nmstate-handler-bcl9w"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.255792 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-vs8wz"]
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.256777 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vs8wz"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.259652 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-cgmrf"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.259880 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.261107 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.265385 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-vs8wz"]
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.331849 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/136cb8fe-69bb-40ec-badd-ccb28dbf8a49-ovs-socket\") pod \"nmstate-handler-bcl9w\" (UID: \"136cb8fe-69bb-40ec-badd-ccb28dbf8a49\") " pod="openshift-nmstate/nmstate-handler-bcl9w"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.331990 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tqll\" (UniqueName: \"kubernetes.io/projected/88d329fc-eca2-4b78-8aa7-af0444906ebf-kube-api-access-8tqll\") pod \"nmstate-webhook-5f558f5558-2wb9n\" (UID: \"88d329fc-eca2-4b78-8aa7-af0444906ebf\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-2wb9n"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.332022 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/136cb8fe-69bb-40ec-badd-ccb28dbf8a49-nmstate-lock\") pod \"nmstate-handler-bcl9w\" (UID: \"136cb8fe-69bb-40ec-badd-ccb28dbf8a49\") " pod="openshift-nmstate/nmstate-handler-bcl9w"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.332096 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/136cb8fe-69bb-40ec-badd-ccb28dbf8a49-dbus-socket\") pod \"nmstate-handler-bcl9w\" (UID: \"136cb8fe-69bb-40ec-badd-ccb28dbf8a49\") " pod="openshift-nmstate/nmstate-handler-bcl9w"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.332155 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whl26\" (UniqueName: \"kubernetes.io/projected/935a2ea2-68f9-4295-ab4d-53eb5403edfc-kube-api-access-whl26\") pod \"nmstate-metrics-9b8c8685d-dc646\" (UID: \"935a2ea2-68f9-4295-ab4d-53eb5403edfc\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-dc646"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.332210 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/88d329fc-eca2-4b78-8aa7-af0444906ebf-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-2wb9n\" (UID: \"88d329fc-eca2-4b78-8aa7-af0444906ebf\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-2wb9n"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.332236 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgzhg\" (UniqueName: \"kubernetes.io/projected/136cb8fe-69bb-40ec-badd-ccb28dbf8a49-kube-api-access-mgzhg\") pod \"nmstate-handler-bcl9w\" (UID: \"136cb8fe-69bb-40ec-badd-ccb28dbf8a49\") " pod="openshift-nmstate/nmstate-handler-bcl9w"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.332227 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/136cb8fe-69bb-40ec-badd-ccb28dbf8a49-ovs-socket\") pod \"nmstate-handler-bcl9w\" (UID: \"136cb8fe-69bb-40ec-badd-ccb28dbf8a49\") " pod="openshift-nmstate/nmstate-handler-bcl9w"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.332663 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/136cb8fe-69bb-40ec-badd-ccb28dbf8a49-dbus-socket\") pod \"nmstate-handler-bcl9w\" (UID: \"136cb8fe-69bb-40ec-badd-ccb28dbf8a49\") " pod="openshift-nmstate/nmstate-handler-bcl9w"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.332812 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/136cb8fe-69bb-40ec-badd-ccb28dbf8a49-nmstate-lock\") pod \"nmstate-handler-bcl9w\" (UID: \"136cb8fe-69bb-40ec-badd-ccb28dbf8a49\") " pod="openshift-nmstate/nmstate-handler-bcl9w"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.341299 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/88d329fc-eca2-4b78-8aa7-af0444906ebf-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-2wb9n\" (UID: \"88d329fc-eca2-4b78-8aa7-af0444906ebf\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-2wb9n"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.354879 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tqll\" (UniqueName: \"kubernetes.io/projected/88d329fc-eca2-4b78-8aa7-af0444906ebf-kube-api-access-8tqll\") pod \"nmstate-webhook-5f558f5558-2wb9n\" (UID: \"88d329fc-eca2-4b78-8aa7-af0444906ebf\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-2wb9n"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.355680 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgzhg\" (UniqueName: \"kubernetes.io/projected/136cb8fe-69bb-40ec-badd-ccb28dbf8a49-kube-api-access-mgzhg\") pod \"nmstate-handler-bcl9w\" (UID: \"136cb8fe-69bb-40ec-badd-ccb28dbf8a49\") " pod="openshift-nmstate/nmstate-handler-bcl9w"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.364777 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whl26\" (UniqueName: \"kubernetes.io/projected/935a2ea2-68f9-4295-ab4d-53eb5403edfc-kube-api-access-whl26\") pod \"nmstate-metrics-9b8c8685d-dc646\" (UID: \"935a2ea2-68f9-4295-ab4d-53eb5403edfc\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-dc646"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.394614 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-dc646"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.418548 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-2wb9n"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.451053 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-bcl9w"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.452738 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/812dca14-f391-4297-831b-007e693807b2-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-vs8wz\" (UID: \"812dca14-f391-4297-831b-007e693807b2\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vs8wz"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.452870 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chlb2\" (UniqueName: \"kubernetes.io/projected/812dca14-f391-4297-831b-007e693807b2-kube-api-access-chlb2\") pod \"nmstate-console-plugin-86f58fcf4-vs8wz\" (UID: \"812dca14-f391-4297-831b-007e693807b2\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vs8wz"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.453065 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/812dca14-f391-4297-831b-007e693807b2-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-vs8wz\" (UID: \"812dca14-f391-4297-831b-007e693807b2\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vs8wz"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.498522 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5678554f8b-hddqh"]
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.500591 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5678554f8b-hddqh"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.512521 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5678554f8b-hddqh"]
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.529551 4695 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.555362 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/812dca14-f391-4297-831b-007e693807b2-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-vs8wz\" (UID: \"812dca14-f391-4297-831b-007e693807b2\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vs8wz"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.555426 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chlb2\" (UniqueName: \"kubernetes.io/projected/812dca14-f391-4297-831b-007e693807b2-kube-api-access-chlb2\") pod \"nmstate-console-plugin-86f58fcf4-vs8wz\" (UID: \"812dca14-f391-4297-831b-007e693807b2\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vs8wz"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.555464 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/812dca14-f391-4297-831b-007e693807b2-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-vs8wz\" (UID: \"812dca14-f391-4297-831b-007e693807b2\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vs8wz"
Mar 20 11:09:05 crc kubenswrapper[4695]: E0320 11:09:05.556173 4695 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found
Mar 20 11:09:05 crc kubenswrapper[4695]: E0320 11:09:05.556321 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/812dca14-f391-4297-831b-007e693807b2-plugin-serving-cert podName:812dca14-f391-4297-831b-007e693807b2 nodeName:}" failed. No retries permitted until 2026-03-20 11:09:06.05628637 +0000 UTC m=+923.836891933 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/812dca14-f391-4297-831b-007e693807b2-plugin-serving-cert") pod "nmstate-console-plugin-86f58fcf4-vs8wz" (UID: "812dca14-f391-4297-831b-007e693807b2") : secret "plugin-serving-cert" not found
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.556743 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/812dca14-f391-4297-831b-007e693807b2-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-vs8wz\" (UID: \"812dca14-f391-4297-831b-007e693807b2\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vs8wz"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.581133 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chlb2\" (UniqueName: \"kubernetes.io/projected/812dca14-f391-4297-831b-007e693807b2-kube-api-access-chlb2\") pod \"nmstate-console-plugin-86f58fcf4-vs8wz\" (UID: \"812dca14-f391-4297-831b-007e693807b2\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vs8wz"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.657403 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0-service-ca\") pod \"console-5678554f8b-hddqh\" (UID: \"5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0\") " pod="openshift-console/console-5678554f8b-hddqh"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.657508 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0-trusted-ca-bundle\") pod \"console-5678554f8b-hddqh\" (UID: \"5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0\") " pod="openshift-console/console-5678554f8b-hddqh"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.657647 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs7pz\" (UniqueName: \"kubernetes.io/projected/5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0-kube-api-access-gs7pz\") pod \"console-5678554f8b-hddqh\" (UID: \"5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0\") " pod="openshift-console/console-5678554f8b-hddqh"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.657851 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0-console-config\") pod \"console-5678554f8b-hddqh\" (UID: \"5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0\") " pod="openshift-console/console-5678554f8b-hddqh"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.658026 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0-console-oauth-config\") pod \"console-5678554f8b-hddqh\" (UID: \"5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0\") " pod="openshift-console/console-5678554f8b-hddqh"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.658116 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0-console-serving-cert\") pod \"console-5678554f8b-hddqh\" (UID: \"5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0\") " pod="openshift-console/console-5678554f8b-hddqh"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.658370 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0-oauth-serving-cert\") pod \"console-5678554f8b-hddqh\" (UID: \"5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0\") " pod="openshift-console/console-5678554f8b-hddqh"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.722415 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-2wb9n"]
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.752070 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-dc646"]
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.759661 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0-service-ca\") pod \"console-5678554f8b-hddqh\" (UID: \"5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0\") " pod="openshift-console/console-5678554f8b-hddqh"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.759709 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0-trusted-ca-bundle\") pod \"console-5678554f8b-hddqh\" (UID: \"5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0\") " pod="openshift-console/console-5678554f8b-hddqh"
Mar 20 11:09:05 crc kubenswrapper[4695]: I0320
11:09:05.759738 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs7pz\" (UniqueName: \"kubernetes.io/projected/5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0-kube-api-access-gs7pz\") pod \"console-5678554f8b-hddqh\" (UID: \"5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0\") " pod="openshift-console/console-5678554f8b-hddqh" Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.759778 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0-console-config\") pod \"console-5678554f8b-hddqh\" (UID: \"5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0\") " pod="openshift-console/console-5678554f8b-hddqh" Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.759809 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0-console-oauth-config\") pod \"console-5678554f8b-hddqh\" (UID: \"5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0\") " pod="openshift-console/console-5678554f8b-hddqh" Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.759867 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0-console-serving-cert\") pod \"console-5678554f8b-hddqh\" (UID: \"5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0\") " pod="openshift-console/console-5678554f8b-hddqh" Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.759890 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0-oauth-serving-cert\") pod \"console-5678554f8b-hddqh\" (UID: \"5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0\") " pod="openshift-console/console-5678554f8b-hddqh" Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 
11:09:05.761073 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0-service-ca\") pod \"console-5678554f8b-hddqh\" (UID: \"5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0\") " pod="openshift-console/console-5678554f8b-hddqh" Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.761189 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0-trusted-ca-bundle\") pod \"console-5678554f8b-hddqh\" (UID: \"5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0\") " pod="openshift-console/console-5678554f8b-hddqh" Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.761268 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0-oauth-serving-cert\") pod \"console-5678554f8b-hddqh\" (UID: \"5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0\") " pod="openshift-console/console-5678554f8b-hddqh" Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.761417 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0-console-config\") pod \"console-5678554f8b-hddqh\" (UID: \"5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0\") " pod="openshift-console/console-5678554f8b-hddqh" Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.766248 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0-console-oauth-config\") pod \"console-5678554f8b-hddqh\" (UID: \"5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0\") " pod="openshift-console/console-5678554f8b-hddqh" Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.766820 4695 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0-console-serving-cert\") pod \"console-5678554f8b-hddqh\" (UID: \"5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0\") " pod="openshift-console/console-5678554f8b-hddqh" Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.778093 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs7pz\" (UniqueName: \"kubernetes.io/projected/5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0-kube-api-access-gs7pz\") pod \"console-5678554f8b-hddqh\" (UID: \"5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0\") " pod="openshift-console/console-5678554f8b-hddqh" Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.818032 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5678554f8b-hddqh" Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.947068 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-2wb9n" event={"ID":"88d329fc-eca2-4b78-8aa7-af0444906ebf","Type":"ContainerStarted","Data":"ccb36fe340c0664eaed409839ed7bc2f56146fb0bdc3033bfa5c872cdf6ca67a"} Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.949656 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-dc646" event={"ID":"935a2ea2-68f9-4295-ab4d-53eb5403edfc","Type":"ContainerStarted","Data":"9af48956cc3d58562bfb6198456bcb364edf70a205c65ffb645a9c78cbb8402d"} Mar 20 11:09:05 crc kubenswrapper[4695]: I0320 11:09:05.953477 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-bcl9w" event={"ID":"136cb8fe-69bb-40ec-badd-ccb28dbf8a49","Type":"ContainerStarted","Data":"a5622184d1b586800074ed289fe3faf8384c6cf6a3be7d2e5457d0df0563161f"} Mar 20 11:09:06 crc kubenswrapper[4695]: I0320 11:09:06.027243 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5678554f8b-hddqh"] Mar 20 
11:09:06 crc kubenswrapper[4695]: W0320 11:09:06.040480 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fd85d73_df5a_4b7e_8b3c_55fc9467cdb0.slice/crio-431474c02babbd3d9cff09ca8c8fefe14cb4053f40389ac35ac525de00e379aa WatchSource:0}: Error finding container 431474c02babbd3d9cff09ca8c8fefe14cb4053f40389ac35ac525de00e379aa: Status 404 returned error can't find the container with id 431474c02babbd3d9cff09ca8c8fefe14cb4053f40389ac35ac525de00e379aa Mar 20 11:09:06 crc kubenswrapper[4695]: I0320 11:09:06.066678 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/812dca14-f391-4297-831b-007e693807b2-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-vs8wz\" (UID: \"812dca14-f391-4297-831b-007e693807b2\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vs8wz" Mar 20 11:09:06 crc kubenswrapper[4695]: I0320 11:09:06.072970 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/812dca14-f391-4297-831b-007e693807b2-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-vs8wz\" (UID: \"812dca14-f391-4297-831b-007e693807b2\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vs8wz" Mar 20 11:09:06 crc kubenswrapper[4695]: I0320 11:09:06.178723 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vs8wz" Mar 20 11:09:06 crc kubenswrapper[4695]: I0320 11:09:06.592581 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-vs8wz"] Mar 20 11:09:06 crc kubenswrapper[4695]: W0320 11:09:06.601871 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod812dca14_f391_4297_831b_007e693807b2.slice/crio-41f906399b8f621463981724646a8bd7d4b16485e3d547f7e3133dc6748425d9 WatchSource:0}: Error finding container 41f906399b8f621463981724646a8bd7d4b16485e3d547f7e3133dc6748425d9: Status 404 returned error can't find the container with id 41f906399b8f621463981724646a8bd7d4b16485e3d547f7e3133dc6748425d9 Mar 20 11:09:06 crc kubenswrapper[4695]: I0320 11:09:06.961474 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5678554f8b-hddqh" event={"ID":"5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0","Type":"ContainerStarted","Data":"eedf28e012c44bea63f37d554e1ea87b1a5afc30a49d477d537f6c7b2fe04fdc"} Mar 20 11:09:06 crc kubenswrapper[4695]: I0320 11:09:06.961542 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5678554f8b-hddqh" event={"ID":"5fd85d73-df5a-4b7e-8b3c-55fc9467cdb0","Type":"ContainerStarted","Data":"431474c02babbd3d9cff09ca8c8fefe14cb4053f40389ac35ac525de00e379aa"} Mar 20 11:09:06 crc kubenswrapper[4695]: I0320 11:09:06.965347 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vs8wz" event={"ID":"812dca14-f391-4297-831b-007e693807b2","Type":"ContainerStarted","Data":"41f906399b8f621463981724646a8bd7d4b16485e3d547f7e3133dc6748425d9"} Mar 20 11:09:06 crc kubenswrapper[4695]: I0320 11:09:06.984703 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5678554f8b-hddqh" podStartSLOduration=1.98467821 
podStartE2EDuration="1.98467821s" podCreationTimestamp="2026-03-20 11:09:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:09:06.982853673 +0000 UTC m=+924.763459256" watchObservedRunningTime="2026-03-20 11:09:06.98467821 +0000 UTC m=+924.765283773" Mar 20 11:09:08 crc kubenswrapper[4695]: I0320 11:09:08.981628 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-bcl9w" event={"ID":"136cb8fe-69bb-40ec-badd-ccb28dbf8a49","Type":"ContainerStarted","Data":"46602301f53ebf482e1cf3a842ce7982700e1eda4012df2e5c3fceae04889126"} Mar 20 11:09:08 crc kubenswrapper[4695]: I0320 11:09:08.983125 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-bcl9w" Mar 20 11:09:08 crc kubenswrapper[4695]: I0320 11:09:08.986117 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-2wb9n" event={"ID":"88d329fc-eca2-4b78-8aa7-af0444906ebf","Type":"ContainerStarted","Data":"7421b20c27afc560f32fe9294da325eeaeb1a2429dfd6d27baf37c2f31f5050e"} Mar 20 11:09:08 crc kubenswrapper[4695]: I0320 11:09:08.987117 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-2wb9n" Mar 20 11:09:08 crc kubenswrapper[4695]: I0320 11:09:08.990146 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-dc646" event={"ID":"935a2ea2-68f9-4295-ab4d-53eb5403edfc","Type":"ContainerStarted","Data":"592b80da48a4d953377891f64021fab63f1231fd31c02ab386f5e2cae1effdc0"} Mar 20 11:09:09 crc kubenswrapper[4695]: I0320 11:09:09.003157 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-bcl9w" podStartSLOduration=1.219886729 podStartE2EDuration="4.003123704s" podCreationTimestamp="2026-03-20 11:09:05 +0000 UTC" 
firstStartedPulling="2026-03-20 11:09:05.529188523 +0000 UTC m=+923.309794086" lastFinishedPulling="2026-03-20 11:09:08.312425508 +0000 UTC m=+926.093031061" observedRunningTime="2026-03-20 11:09:09.00297591 +0000 UTC m=+926.783581473" watchObservedRunningTime="2026-03-20 11:09:09.003123704 +0000 UTC m=+926.783729277" Mar 20 11:09:09 crc kubenswrapper[4695]: I0320 11:09:09.029480 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-2wb9n" podStartSLOduration=1.459469825 podStartE2EDuration="4.029448311s" podCreationTimestamp="2026-03-20 11:09:05 +0000 UTC" firstStartedPulling="2026-03-20 11:09:05.741340054 +0000 UTC m=+923.521945627" lastFinishedPulling="2026-03-20 11:09:08.31131855 +0000 UTC m=+926.091924113" observedRunningTime="2026-03-20 11:09:09.027474761 +0000 UTC m=+926.808080324" watchObservedRunningTime="2026-03-20 11:09:09.029448311 +0000 UTC m=+926.810053874" Mar 20 11:09:10 crc kubenswrapper[4695]: I0320 11:09:10.002759 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vs8wz" event={"ID":"812dca14-f391-4297-831b-007e693807b2","Type":"ContainerStarted","Data":"fc9efc19e1e1b8113f4bfeaf199713eba71a6d62de752b0a35e72c20a6052169"} Mar 20 11:09:10 crc kubenswrapper[4695]: I0320 11:09:10.022681 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-vs8wz" podStartSLOduration=2.177634489 podStartE2EDuration="5.022656495s" podCreationTimestamp="2026-03-20 11:09:05 +0000 UTC" firstStartedPulling="2026-03-20 11:09:06.605411644 +0000 UTC m=+924.386017207" lastFinishedPulling="2026-03-20 11:09:09.45043365 +0000 UTC m=+927.231039213" observedRunningTime="2026-03-20 11:09:10.020616202 +0000 UTC m=+927.801221765" watchObservedRunningTime="2026-03-20 11:09:10.022656495 +0000 UTC m=+927.803262048" Mar 20 11:09:12 crc kubenswrapper[4695]: I0320 11:09:12.019256 4695 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-dc646" event={"ID":"935a2ea2-68f9-4295-ab4d-53eb5403edfc","Type":"ContainerStarted","Data":"693d2b15921a3f5b97972aad1dac1fc3000eb806e16b6256f8f9db9ee83a87e9"} Mar 20 11:09:12 crc kubenswrapper[4695]: I0320 11:09:12.048273 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-dc646" podStartSLOduration=1.795952 podStartE2EDuration="7.048236862s" podCreationTimestamp="2026-03-20 11:09:05 +0000 UTC" firstStartedPulling="2026-03-20 11:09:05.758548318 +0000 UTC m=+923.539153881" lastFinishedPulling="2026-03-20 11:09:11.01083318 +0000 UTC m=+928.791438743" observedRunningTime="2026-03-20 11:09:12.042744051 +0000 UTC m=+929.823349614" watchObservedRunningTime="2026-03-20 11:09:12.048236862 +0000 UTC m=+929.828842435" Mar 20 11:09:15 crc kubenswrapper[4695]: I0320 11:09:15.481509 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-bcl9w" Mar 20 11:09:15 crc kubenswrapper[4695]: I0320 11:09:15.818429 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5678554f8b-hddqh" Mar 20 11:09:15 crc kubenswrapper[4695]: I0320 11:09:15.818518 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5678554f8b-hddqh" Mar 20 11:09:15 crc kubenswrapper[4695]: I0320 11:09:15.825175 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5678554f8b-hddqh" Mar 20 11:09:16 crc kubenswrapper[4695]: I0320 11:09:16.049052 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5678554f8b-hddqh" Mar 20 11:09:16 crc kubenswrapper[4695]: I0320 11:09:16.107168 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-s2xcj"] Mar 20 11:09:25 crc 
kubenswrapper[4695]: I0320 11:09:25.427213 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-2wb9n" Mar 20 11:09:38 crc kubenswrapper[4695]: I0320 11:09:38.431608 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:09:38 crc kubenswrapper[4695]: I0320 11:09:38.432297 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:09:38 crc kubenswrapper[4695]: I0320 11:09:38.514759 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8"] Mar 20 11:09:38 crc kubenswrapper[4695]: I0320 11:09:38.516353 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8" Mar 20 11:09:38 crc kubenswrapper[4695]: I0320 11:09:38.520118 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 11:09:38 crc kubenswrapper[4695]: I0320 11:09:38.520174 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8"] Mar 20 11:09:38 crc kubenswrapper[4695]: I0320 11:09:38.607780 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ed0aecb5-8bed-4a2b-af31-67cb15eeee8d-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8\" (UID: \"ed0aecb5-8bed-4a2b-af31-67cb15eeee8d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8" Mar 20 11:09:38 crc kubenswrapper[4695]: I0320 11:09:38.607877 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29wdv\" (UniqueName: \"kubernetes.io/projected/ed0aecb5-8bed-4a2b-af31-67cb15eeee8d-kube-api-access-29wdv\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8\" (UID: \"ed0aecb5-8bed-4a2b-af31-67cb15eeee8d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8" Mar 20 11:09:38 crc kubenswrapper[4695]: I0320 11:09:38.607989 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ed0aecb5-8bed-4a2b-af31-67cb15eeee8d-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8\" (UID: \"ed0aecb5-8bed-4a2b-af31-67cb15eeee8d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8" Mar 20 11:09:38 crc kubenswrapper[4695]: 
I0320 11:09:38.708848 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ed0aecb5-8bed-4a2b-af31-67cb15eeee8d-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8\" (UID: \"ed0aecb5-8bed-4a2b-af31-67cb15eeee8d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8" Mar 20 11:09:38 crc kubenswrapper[4695]: I0320 11:09:38.708942 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29wdv\" (UniqueName: \"kubernetes.io/projected/ed0aecb5-8bed-4a2b-af31-67cb15eeee8d-kube-api-access-29wdv\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8\" (UID: \"ed0aecb5-8bed-4a2b-af31-67cb15eeee8d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8" Mar 20 11:09:38 crc kubenswrapper[4695]: I0320 11:09:38.709014 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ed0aecb5-8bed-4a2b-af31-67cb15eeee8d-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8\" (UID: \"ed0aecb5-8bed-4a2b-af31-67cb15eeee8d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8" Mar 20 11:09:38 crc kubenswrapper[4695]: I0320 11:09:38.709426 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ed0aecb5-8bed-4a2b-af31-67cb15eeee8d-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8\" (UID: \"ed0aecb5-8bed-4a2b-af31-67cb15eeee8d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8" Mar 20 11:09:38 crc kubenswrapper[4695]: I0320 11:09:38.709555 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/ed0aecb5-8bed-4a2b-af31-67cb15eeee8d-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8\" (UID: \"ed0aecb5-8bed-4a2b-af31-67cb15eeee8d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8" Mar 20 11:09:38 crc kubenswrapper[4695]: I0320 11:09:38.734961 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29wdv\" (UniqueName: \"kubernetes.io/projected/ed0aecb5-8bed-4a2b-af31-67cb15eeee8d-kube-api-access-29wdv\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8\" (UID: \"ed0aecb5-8bed-4a2b-af31-67cb15eeee8d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8" Mar 20 11:09:38 crc kubenswrapper[4695]: I0320 11:09:38.844042 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8" Mar 20 11:09:39 crc kubenswrapper[4695]: I0320 11:09:39.292636 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8"] Mar 20 11:09:40 crc kubenswrapper[4695]: I0320 11:09:40.212473 4695 generic.go:334] "Generic (PLEG): container finished" podID="ed0aecb5-8bed-4a2b-af31-67cb15eeee8d" containerID="f7aca4528423bf97758af09e81624547595d359d6b9e6104ccf93c2a83efb84e" exitCode=0 Mar 20 11:09:40 crc kubenswrapper[4695]: I0320 11:09:40.212552 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8" event={"ID":"ed0aecb5-8bed-4a2b-af31-67cb15eeee8d","Type":"ContainerDied","Data":"f7aca4528423bf97758af09e81624547595d359d6b9e6104ccf93c2a83efb84e"} Mar 20 11:09:40 crc kubenswrapper[4695]: I0320 11:09:40.212861 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8" event={"ID":"ed0aecb5-8bed-4a2b-af31-67cb15eeee8d","Type":"ContainerStarted","Data":"fa1a65ed46f76f0580bfa9ec8669876d52418afa052c0747af56edc322095a06"} Mar 20 11:09:41 crc kubenswrapper[4695]: I0320 11:09:41.152395 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-s2xcj" podUID="874d0ff7-4923-4423-920a-59e6a632507a" containerName="console" containerID="cri-o://936f9e8ad7d7aeee8691eef9c50d42dce666dac679cf33d71435f3c3a7ccdccb" gracePeriod=15 Mar 20 11:09:41 crc kubenswrapper[4695]: I0320 11:09:41.516346 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-s2xcj_874d0ff7-4923-4423-920a-59e6a632507a/console/0.log" Mar 20 11:09:41 crc kubenswrapper[4695]: I0320 11:09:41.516420 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-s2xcj" Mar 20 11:09:41 crc kubenswrapper[4695]: I0320 11:09:41.650920 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/874d0ff7-4923-4423-920a-59e6a632507a-console-oauth-config\") pod \"874d0ff7-4923-4423-920a-59e6a632507a\" (UID: \"874d0ff7-4923-4423-920a-59e6a632507a\") " Mar 20 11:09:41 crc kubenswrapper[4695]: I0320 11:09:41.651115 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/874d0ff7-4923-4423-920a-59e6a632507a-trusted-ca-bundle\") pod \"874d0ff7-4923-4423-920a-59e6a632507a\" (UID: \"874d0ff7-4923-4423-920a-59e6a632507a\") " Mar 20 11:09:41 crc kubenswrapper[4695]: I0320 11:09:41.651221 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/874d0ff7-4923-4423-920a-59e6a632507a-oauth-serving-cert\") pod \"874d0ff7-4923-4423-920a-59e6a632507a\" (UID: \"874d0ff7-4923-4423-920a-59e6a632507a\") " Mar 20 11:09:41 crc kubenswrapper[4695]: I0320 11:09:41.651264 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/874d0ff7-4923-4423-920a-59e6a632507a-service-ca\") pod \"874d0ff7-4923-4423-920a-59e6a632507a\" (UID: \"874d0ff7-4923-4423-920a-59e6a632507a\") " Mar 20 11:09:41 crc kubenswrapper[4695]: I0320 11:09:41.651289 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/874d0ff7-4923-4423-920a-59e6a632507a-console-serving-cert\") pod \"874d0ff7-4923-4423-920a-59e6a632507a\" (UID: \"874d0ff7-4923-4423-920a-59e6a632507a\") " Mar 20 11:09:41 crc kubenswrapper[4695]: I0320 11:09:41.651321 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpjln\" (UniqueName: \"kubernetes.io/projected/874d0ff7-4923-4423-920a-59e6a632507a-kube-api-access-rpjln\") pod \"874d0ff7-4923-4423-920a-59e6a632507a\" (UID: \"874d0ff7-4923-4423-920a-59e6a632507a\") " Mar 20 11:09:41 crc kubenswrapper[4695]: I0320 11:09:41.651356 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/874d0ff7-4923-4423-920a-59e6a632507a-console-config\") pod \"874d0ff7-4923-4423-920a-59e6a632507a\" (UID: \"874d0ff7-4923-4423-920a-59e6a632507a\") " Mar 20 11:09:41 crc kubenswrapper[4695]: I0320 11:09:41.652581 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/874d0ff7-4923-4423-920a-59e6a632507a-console-config" (OuterVolumeSpecName: "console-config") pod "874d0ff7-4923-4423-920a-59e6a632507a" (UID: "874d0ff7-4923-4423-920a-59e6a632507a"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:09:41 crc kubenswrapper[4695]: I0320 11:09:41.652594 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/874d0ff7-4923-4423-920a-59e6a632507a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "874d0ff7-4923-4423-920a-59e6a632507a" (UID: "874d0ff7-4923-4423-920a-59e6a632507a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:09:41 crc kubenswrapper[4695]: I0320 11:09:41.652573 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/874d0ff7-4923-4423-920a-59e6a632507a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "874d0ff7-4923-4423-920a-59e6a632507a" (UID: "874d0ff7-4923-4423-920a-59e6a632507a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:09:41 crc kubenswrapper[4695]: I0320 11:09:41.652634 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/874d0ff7-4923-4423-920a-59e6a632507a-service-ca" (OuterVolumeSpecName: "service-ca") pod "874d0ff7-4923-4423-920a-59e6a632507a" (UID: "874d0ff7-4923-4423-920a-59e6a632507a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:09:41 crc kubenswrapper[4695]: I0320 11:09:41.658639 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/874d0ff7-4923-4423-920a-59e6a632507a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "874d0ff7-4923-4423-920a-59e6a632507a" (UID: "874d0ff7-4923-4423-920a-59e6a632507a"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:09:41 crc kubenswrapper[4695]: I0320 11:09:41.658714 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/874d0ff7-4923-4423-920a-59e6a632507a-kube-api-access-rpjln" (OuterVolumeSpecName: "kube-api-access-rpjln") pod "874d0ff7-4923-4423-920a-59e6a632507a" (UID: "874d0ff7-4923-4423-920a-59e6a632507a"). InnerVolumeSpecName "kube-api-access-rpjln". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:09:41 crc kubenswrapper[4695]: I0320 11:09:41.660003 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/874d0ff7-4923-4423-920a-59e6a632507a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "874d0ff7-4923-4423-920a-59e6a632507a" (UID: "874d0ff7-4923-4423-920a-59e6a632507a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:09:41 crc kubenswrapper[4695]: I0320 11:09:41.752677 4695 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/874d0ff7-4923-4423-920a-59e6a632507a-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:41 crc kubenswrapper[4695]: I0320 11:09:41.752755 4695 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/874d0ff7-4923-4423-920a-59e6a632507a-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:41 crc kubenswrapper[4695]: I0320 11:09:41.752771 4695 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/874d0ff7-4923-4423-920a-59e6a632507a-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:41 crc kubenswrapper[4695]: I0320 11:09:41.752781 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpjln\" (UniqueName: 
\"kubernetes.io/projected/874d0ff7-4923-4423-920a-59e6a632507a-kube-api-access-rpjln\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:41 crc kubenswrapper[4695]: I0320 11:09:41.752790 4695 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/874d0ff7-4923-4423-920a-59e6a632507a-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:41 crc kubenswrapper[4695]: I0320 11:09:41.752799 4695 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/874d0ff7-4923-4423-920a-59e6a632507a-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:41 crc kubenswrapper[4695]: I0320 11:09:41.752807 4695 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/874d0ff7-4923-4423-920a-59e6a632507a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:42 crc kubenswrapper[4695]: I0320 11:09:42.228218 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-s2xcj_874d0ff7-4923-4423-920a-59e6a632507a/console/0.log" Mar 20 11:09:42 crc kubenswrapper[4695]: I0320 11:09:42.228642 4695 generic.go:334] "Generic (PLEG): container finished" podID="874d0ff7-4923-4423-920a-59e6a632507a" containerID="936f9e8ad7d7aeee8691eef9c50d42dce666dac679cf33d71435f3c3a7ccdccb" exitCode=2 Mar 20 11:09:42 crc kubenswrapper[4695]: I0320 11:09:42.228684 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-s2xcj" event={"ID":"874d0ff7-4923-4423-920a-59e6a632507a","Type":"ContainerDied","Data":"936f9e8ad7d7aeee8691eef9c50d42dce666dac679cf33d71435f3c3a7ccdccb"} Mar 20 11:09:42 crc kubenswrapper[4695]: I0320 11:09:42.228728 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-s2xcj" 
event={"ID":"874d0ff7-4923-4423-920a-59e6a632507a","Type":"ContainerDied","Data":"67cf2527879fe5af1d3fe7701553016fb56a0e7ca4aef8c5b97142a746107ce9"} Mar 20 11:09:42 crc kubenswrapper[4695]: I0320 11:09:42.228748 4695 scope.go:117] "RemoveContainer" containerID="936f9e8ad7d7aeee8691eef9c50d42dce666dac679cf33d71435f3c3a7ccdccb" Mar 20 11:09:42 crc kubenswrapper[4695]: I0320 11:09:42.228800 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-s2xcj" Mar 20 11:09:42 crc kubenswrapper[4695]: I0320 11:09:42.254111 4695 scope.go:117] "RemoveContainer" containerID="936f9e8ad7d7aeee8691eef9c50d42dce666dac679cf33d71435f3c3a7ccdccb" Mar 20 11:09:42 crc kubenswrapper[4695]: E0320 11:09:42.254612 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"936f9e8ad7d7aeee8691eef9c50d42dce666dac679cf33d71435f3c3a7ccdccb\": container with ID starting with 936f9e8ad7d7aeee8691eef9c50d42dce666dac679cf33d71435f3c3a7ccdccb not found: ID does not exist" containerID="936f9e8ad7d7aeee8691eef9c50d42dce666dac679cf33d71435f3c3a7ccdccb" Mar 20 11:09:42 crc kubenswrapper[4695]: I0320 11:09:42.254661 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"936f9e8ad7d7aeee8691eef9c50d42dce666dac679cf33d71435f3c3a7ccdccb"} err="failed to get container status \"936f9e8ad7d7aeee8691eef9c50d42dce666dac679cf33d71435f3c3a7ccdccb\": rpc error: code = NotFound desc = could not find container \"936f9e8ad7d7aeee8691eef9c50d42dce666dac679cf33d71435f3c3a7ccdccb\": container with ID starting with 936f9e8ad7d7aeee8691eef9c50d42dce666dac679cf33d71435f3c3a7ccdccb not found: ID does not exist" Mar 20 11:09:42 crc kubenswrapper[4695]: I0320 11:09:42.266850 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-s2xcj"] Mar 20 11:09:42 crc kubenswrapper[4695]: I0320 11:09:42.271588 4695 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-s2xcj"] Mar 20 11:09:42 crc kubenswrapper[4695]: I0320 11:09:42.898567 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="874d0ff7-4923-4423-920a-59e6a632507a" path="/var/lib/kubelet/pods/874d0ff7-4923-4423-920a-59e6a632507a/volumes" Mar 20 11:09:43 crc kubenswrapper[4695]: I0320 11:09:43.237764 4695 generic.go:334] "Generic (PLEG): container finished" podID="ed0aecb5-8bed-4a2b-af31-67cb15eeee8d" containerID="4c32d40e84bf849bacbc71d5e3840631f4e580859dd3692b6d49ed41ccad0413" exitCode=0 Mar 20 11:09:43 crc kubenswrapper[4695]: I0320 11:09:43.237855 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8" event={"ID":"ed0aecb5-8bed-4a2b-af31-67cb15eeee8d","Type":"ContainerDied","Data":"4c32d40e84bf849bacbc71d5e3840631f4e580859dd3692b6d49ed41ccad0413"} Mar 20 11:09:44 crc kubenswrapper[4695]: I0320 11:09:44.249025 4695 generic.go:334] "Generic (PLEG): container finished" podID="ed0aecb5-8bed-4a2b-af31-67cb15eeee8d" containerID="cb4187d3d0ca85c9d776751f12c4da686dc685b859b53b4fe6ebb16b3415746a" exitCode=0 Mar 20 11:09:44 crc kubenswrapper[4695]: I0320 11:09:44.249100 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8" event={"ID":"ed0aecb5-8bed-4a2b-af31-67cb15eeee8d","Type":"ContainerDied","Data":"cb4187d3d0ca85c9d776751f12c4da686dc685b859b53b4fe6ebb16b3415746a"} Mar 20 11:09:45 crc kubenswrapper[4695]: I0320 11:09:45.506706 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8" Mar 20 11:09:45 crc kubenswrapper[4695]: I0320 11:09:45.641082 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ed0aecb5-8bed-4a2b-af31-67cb15eeee8d-bundle\") pod \"ed0aecb5-8bed-4a2b-af31-67cb15eeee8d\" (UID: \"ed0aecb5-8bed-4a2b-af31-67cb15eeee8d\") " Mar 20 11:09:45 crc kubenswrapper[4695]: I0320 11:09:45.641186 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29wdv\" (UniqueName: \"kubernetes.io/projected/ed0aecb5-8bed-4a2b-af31-67cb15eeee8d-kube-api-access-29wdv\") pod \"ed0aecb5-8bed-4a2b-af31-67cb15eeee8d\" (UID: \"ed0aecb5-8bed-4a2b-af31-67cb15eeee8d\") " Mar 20 11:09:45 crc kubenswrapper[4695]: I0320 11:09:45.641455 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ed0aecb5-8bed-4a2b-af31-67cb15eeee8d-util\") pod \"ed0aecb5-8bed-4a2b-af31-67cb15eeee8d\" (UID: \"ed0aecb5-8bed-4a2b-af31-67cb15eeee8d\") " Mar 20 11:09:45 crc kubenswrapper[4695]: I0320 11:09:45.642468 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed0aecb5-8bed-4a2b-af31-67cb15eeee8d-bundle" (OuterVolumeSpecName: "bundle") pod "ed0aecb5-8bed-4a2b-af31-67cb15eeee8d" (UID: "ed0aecb5-8bed-4a2b-af31-67cb15eeee8d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:09:45 crc kubenswrapper[4695]: I0320 11:09:45.649089 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed0aecb5-8bed-4a2b-af31-67cb15eeee8d-kube-api-access-29wdv" (OuterVolumeSpecName: "kube-api-access-29wdv") pod "ed0aecb5-8bed-4a2b-af31-67cb15eeee8d" (UID: "ed0aecb5-8bed-4a2b-af31-67cb15eeee8d"). InnerVolumeSpecName "kube-api-access-29wdv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:09:45 crc kubenswrapper[4695]: I0320 11:09:45.651359 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed0aecb5-8bed-4a2b-af31-67cb15eeee8d-util" (OuterVolumeSpecName: "util") pod "ed0aecb5-8bed-4a2b-af31-67cb15eeee8d" (UID: "ed0aecb5-8bed-4a2b-af31-67cb15eeee8d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:09:45 crc kubenswrapper[4695]: I0320 11:09:45.758038 4695 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ed0aecb5-8bed-4a2b-af31-67cb15eeee8d-util\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:45 crc kubenswrapper[4695]: I0320 11:09:45.758091 4695 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ed0aecb5-8bed-4a2b-af31-67cb15eeee8d-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:45 crc kubenswrapper[4695]: I0320 11:09:45.758104 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29wdv\" (UniqueName: \"kubernetes.io/projected/ed0aecb5-8bed-4a2b-af31-67cb15eeee8d-kube-api-access-29wdv\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:46 crc kubenswrapper[4695]: I0320 11:09:46.263167 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8" event={"ID":"ed0aecb5-8bed-4a2b-af31-67cb15eeee8d","Type":"ContainerDied","Data":"fa1a65ed46f76f0580bfa9ec8669876d52418afa052c0747af56edc322095a06"} Mar 20 11:09:46 crc kubenswrapper[4695]: I0320 11:09:46.263230 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa1a65ed46f76f0580bfa9ec8669876d52418afa052c0747af56edc322095a06" Mar 20 11:09:46 crc kubenswrapper[4695]: I0320 11:09:46.263313 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8" Mar 20 11:09:47 crc kubenswrapper[4695]: I0320 11:09:47.700355 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bbrgl"] Mar 20 11:09:47 crc kubenswrapper[4695]: E0320 11:09:47.701243 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed0aecb5-8bed-4a2b-af31-67cb15eeee8d" containerName="extract" Mar 20 11:09:47 crc kubenswrapper[4695]: I0320 11:09:47.701266 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed0aecb5-8bed-4a2b-af31-67cb15eeee8d" containerName="extract" Mar 20 11:09:47 crc kubenswrapper[4695]: E0320 11:09:47.701288 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed0aecb5-8bed-4a2b-af31-67cb15eeee8d" containerName="pull" Mar 20 11:09:47 crc kubenswrapper[4695]: I0320 11:09:47.701297 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed0aecb5-8bed-4a2b-af31-67cb15eeee8d" containerName="pull" Mar 20 11:09:47 crc kubenswrapper[4695]: E0320 11:09:47.701334 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="874d0ff7-4923-4423-920a-59e6a632507a" containerName="console" Mar 20 11:09:47 crc kubenswrapper[4695]: I0320 11:09:47.701352 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="874d0ff7-4923-4423-920a-59e6a632507a" containerName="console" Mar 20 11:09:47 crc kubenswrapper[4695]: E0320 11:09:47.701375 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed0aecb5-8bed-4a2b-af31-67cb15eeee8d" containerName="util" Mar 20 11:09:47 crc kubenswrapper[4695]: I0320 11:09:47.701383 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed0aecb5-8bed-4a2b-af31-67cb15eeee8d" containerName="util" Mar 20 11:09:47 crc kubenswrapper[4695]: I0320 11:09:47.701556 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed0aecb5-8bed-4a2b-af31-67cb15eeee8d" containerName="extract" Mar 
20 11:09:47 crc kubenswrapper[4695]: I0320 11:09:47.701583 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="874d0ff7-4923-4423-920a-59e6a632507a" containerName="console" Mar 20 11:09:47 crc kubenswrapper[4695]: I0320 11:09:47.702778 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bbrgl" Mar 20 11:09:47 crc kubenswrapper[4695]: I0320 11:09:47.712465 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bbrgl"] Mar 20 11:09:47 crc kubenswrapper[4695]: I0320 11:09:47.787697 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53304d76-9511-4108-96ed-45c5815f9d47-utilities\") pod \"certified-operators-bbrgl\" (UID: \"53304d76-9511-4108-96ed-45c5815f9d47\") " pod="openshift-marketplace/certified-operators-bbrgl" Mar 20 11:09:47 crc kubenswrapper[4695]: I0320 11:09:47.787780 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d9j4\" (UniqueName: \"kubernetes.io/projected/53304d76-9511-4108-96ed-45c5815f9d47-kube-api-access-2d9j4\") pod \"certified-operators-bbrgl\" (UID: \"53304d76-9511-4108-96ed-45c5815f9d47\") " pod="openshift-marketplace/certified-operators-bbrgl" Mar 20 11:09:47 crc kubenswrapper[4695]: I0320 11:09:47.787835 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53304d76-9511-4108-96ed-45c5815f9d47-catalog-content\") pod \"certified-operators-bbrgl\" (UID: \"53304d76-9511-4108-96ed-45c5815f9d47\") " pod="openshift-marketplace/certified-operators-bbrgl" Mar 20 11:09:47 crc kubenswrapper[4695]: I0320 11:09:47.889627 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/53304d76-9511-4108-96ed-45c5815f9d47-utilities\") pod \"certified-operators-bbrgl\" (UID: \"53304d76-9511-4108-96ed-45c5815f9d47\") " pod="openshift-marketplace/certified-operators-bbrgl" Mar 20 11:09:47 crc kubenswrapper[4695]: I0320 11:09:47.889698 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d9j4\" (UniqueName: \"kubernetes.io/projected/53304d76-9511-4108-96ed-45c5815f9d47-kube-api-access-2d9j4\") pod \"certified-operators-bbrgl\" (UID: \"53304d76-9511-4108-96ed-45c5815f9d47\") " pod="openshift-marketplace/certified-operators-bbrgl" Mar 20 11:09:47 crc kubenswrapper[4695]: I0320 11:09:47.889744 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53304d76-9511-4108-96ed-45c5815f9d47-catalog-content\") pod \"certified-operators-bbrgl\" (UID: \"53304d76-9511-4108-96ed-45c5815f9d47\") " pod="openshift-marketplace/certified-operators-bbrgl" Mar 20 11:09:47 crc kubenswrapper[4695]: I0320 11:09:47.890460 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53304d76-9511-4108-96ed-45c5815f9d47-utilities\") pod \"certified-operators-bbrgl\" (UID: \"53304d76-9511-4108-96ed-45c5815f9d47\") " pod="openshift-marketplace/certified-operators-bbrgl" Mar 20 11:09:47 crc kubenswrapper[4695]: I0320 11:09:47.890509 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53304d76-9511-4108-96ed-45c5815f9d47-catalog-content\") pod \"certified-operators-bbrgl\" (UID: \"53304d76-9511-4108-96ed-45c5815f9d47\") " pod="openshift-marketplace/certified-operators-bbrgl" Mar 20 11:09:47 crc kubenswrapper[4695]: I0320 11:09:47.914830 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d9j4\" (UniqueName: 
\"kubernetes.io/projected/53304d76-9511-4108-96ed-45c5815f9d47-kube-api-access-2d9j4\") pod \"certified-operators-bbrgl\" (UID: \"53304d76-9511-4108-96ed-45c5815f9d47\") " pod="openshift-marketplace/certified-operators-bbrgl" Mar 20 11:09:48 crc kubenswrapper[4695]: I0320 11:09:48.029370 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bbrgl" Mar 20 11:09:48 crc kubenswrapper[4695]: I0320 11:09:48.328455 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bbrgl"] Mar 20 11:09:49 crc kubenswrapper[4695]: I0320 11:09:49.284766 4695 generic.go:334] "Generic (PLEG): container finished" podID="53304d76-9511-4108-96ed-45c5815f9d47" containerID="4ae15098bc58b09dcad782143c72934ff6a2f8e83ef487201aa517cfcd9270ff" exitCode=0 Mar 20 11:09:49 crc kubenswrapper[4695]: I0320 11:09:49.284866 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbrgl" event={"ID":"53304d76-9511-4108-96ed-45c5815f9d47","Type":"ContainerDied","Data":"4ae15098bc58b09dcad782143c72934ff6a2f8e83ef487201aa517cfcd9270ff"} Mar 20 11:09:49 crc kubenswrapper[4695]: I0320 11:09:49.285266 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbrgl" event={"ID":"53304d76-9511-4108-96ed-45c5815f9d47","Type":"ContainerStarted","Data":"e03de5029f8503620a3c1cc41bee41b9e62ad46d98312b483aafba30261ec5be"} Mar 20 11:09:50 crc kubenswrapper[4695]: I0320 11:09:50.293838 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbrgl" event={"ID":"53304d76-9511-4108-96ed-45c5815f9d47","Type":"ContainerStarted","Data":"bd5c7daebdadef479c76dd733db9238a6f2c6db3423e6251da1f89d2aa0204de"} Mar 20 11:09:52 crc kubenswrapper[4695]: I0320 11:09:52.320766 4695 generic.go:334] "Generic (PLEG): container finished" podID="53304d76-9511-4108-96ed-45c5815f9d47" 
containerID="bd5c7daebdadef479c76dd733db9238a6f2c6db3423e6251da1f89d2aa0204de" exitCode=0 Mar 20 11:09:52 crc kubenswrapper[4695]: I0320 11:09:52.320838 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbrgl" event={"ID":"53304d76-9511-4108-96ed-45c5815f9d47","Type":"ContainerDied","Data":"bd5c7daebdadef479c76dd733db9238a6f2c6db3423e6251da1f89d2aa0204de"} Mar 20 11:09:53 crc kubenswrapper[4695]: I0320 11:09:53.331062 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbrgl" event={"ID":"53304d76-9511-4108-96ed-45c5815f9d47","Type":"ContainerStarted","Data":"d7cd6f60f0131c910c6836b8c0ba9bf78029d9c7c48045032bdbd603b23f6976"} Mar 20 11:09:53 crc kubenswrapper[4695]: I0320 11:09:53.356292 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bbrgl" podStartSLOduration=2.73021277 podStartE2EDuration="6.356260538s" podCreationTimestamp="2026-03-20 11:09:47 +0000 UTC" firstStartedPulling="2026-03-20 11:09:49.286399322 +0000 UTC m=+967.067004885" lastFinishedPulling="2026-03-20 11:09:52.91244709 +0000 UTC m=+970.693052653" observedRunningTime="2026-03-20 11:09:53.352972853 +0000 UTC m=+971.133578416" watchObservedRunningTime="2026-03-20 11:09:53.356260538 +0000 UTC m=+971.136866101" Mar 20 11:09:54 crc kubenswrapper[4695]: I0320 11:09:54.079528 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-58888569cf-fq55h"] Mar 20 11:09:54 crc kubenswrapper[4695]: I0320 11:09:54.080548 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-58888569cf-fq55h" Mar 20 11:09:54 crc kubenswrapper[4695]: I0320 11:09:54.082748 4695 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-t6m4b" Mar 20 11:09:54 crc kubenswrapper[4695]: I0320 11:09:54.083147 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 20 11:09:54 crc kubenswrapper[4695]: I0320 11:09:54.083301 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 20 11:09:54 crc kubenswrapper[4695]: I0320 11:09:54.088431 4695 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 20 11:09:54 crc kubenswrapper[4695]: I0320 11:09:54.088707 4695 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 20 11:09:54 crc kubenswrapper[4695]: I0320 11:09:54.114243 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-58888569cf-fq55h"] Mar 20 11:09:54 crc kubenswrapper[4695]: I0320 11:09:54.262636 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d154e4d7-d533-4e4a-801a-b5db6b39b5f9-apiservice-cert\") pod \"metallb-operator-controller-manager-58888569cf-fq55h\" (UID: \"d154e4d7-d533-4e4a-801a-b5db6b39b5f9\") " pod="metallb-system/metallb-operator-controller-manager-58888569cf-fq55h" Mar 20 11:09:54 crc kubenswrapper[4695]: I0320 11:09:54.262730 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z57vw\" (UniqueName: \"kubernetes.io/projected/d154e4d7-d533-4e4a-801a-b5db6b39b5f9-kube-api-access-z57vw\") pod 
\"metallb-operator-controller-manager-58888569cf-fq55h\" (UID: \"d154e4d7-d533-4e4a-801a-b5db6b39b5f9\") " pod="metallb-system/metallb-operator-controller-manager-58888569cf-fq55h" Mar 20 11:09:54 crc kubenswrapper[4695]: I0320 11:09:54.262801 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d154e4d7-d533-4e4a-801a-b5db6b39b5f9-webhook-cert\") pod \"metallb-operator-controller-manager-58888569cf-fq55h\" (UID: \"d154e4d7-d533-4e4a-801a-b5db6b39b5f9\") " pod="metallb-system/metallb-operator-controller-manager-58888569cf-fq55h" Mar 20 11:09:54 crc kubenswrapper[4695]: I0320 11:09:54.364400 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d154e4d7-d533-4e4a-801a-b5db6b39b5f9-apiservice-cert\") pod \"metallb-operator-controller-manager-58888569cf-fq55h\" (UID: \"d154e4d7-d533-4e4a-801a-b5db6b39b5f9\") " pod="metallb-system/metallb-operator-controller-manager-58888569cf-fq55h" Mar 20 11:09:54 crc kubenswrapper[4695]: I0320 11:09:54.364549 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z57vw\" (UniqueName: \"kubernetes.io/projected/d154e4d7-d533-4e4a-801a-b5db6b39b5f9-kube-api-access-z57vw\") pod \"metallb-operator-controller-manager-58888569cf-fq55h\" (UID: \"d154e4d7-d533-4e4a-801a-b5db6b39b5f9\") " pod="metallb-system/metallb-operator-controller-manager-58888569cf-fq55h" Mar 20 11:09:54 crc kubenswrapper[4695]: I0320 11:09:54.364622 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d154e4d7-d533-4e4a-801a-b5db6b39b5f9-webhook-cert\") pod \"metallb-operator-controller-manager-58888569cf-fq55h\" (UID: \"d154e4d7-d533-4e4a-801a-b5db6b39b5f9\") " pod="metallb-system/metallb-operator-controller-manager-58888569cf-fq55h" Mar 20 11:09:54 crc 
kubenswrapper[4695]: I0320 11:09:54.374429 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d154e4d7-d533-4e4a-801a-b5db6b39b5f9-apiservice-cert\") pod \"metallb-operator-controller-manager-58888569cf-fq55h\" (UID: \"d154e4d7-d533-4e4a-801a-b5db6b39b5f9\") " pod="metallb-system/metallb-operator-controller-manager-58888569cf-fq55h" Mar 20 11:09:54 crc kubenswrapper[4695]: I0320 11:09:54.385509 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d154e4d7-d533-4e4a-801a-b5db6b39b5f9-webhook-cert\") pod \"metallb-operator-controller-manager-58888569cf-fq55h\" (UID: \"d154e4d7-d533-4e4a-801a-b5db6b39b5f9\") " pod="metallb-system/metallb-operator-controller-manager-58888569cf-fq55h" Mar 20 11:09:54 crc kubenswrapper[4695]: I0320 11:09:54.388685 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z57vw\" (UniqueName: \"kubernetes.io/projected/d154e4d7-d533-4e4a-801a-b5db6b39b5f9-kube-api-access-z57vw\") pod \"metallb-operator-controller-manager-58888569cf-fq55h\" (UID: \"d154e4d7-d533-4e4a-801a-b5db6b39b5f9\") " pod="metallb-system/metallb-operator-controller-manager-58888569cf-fq55h" Mar 20 11:09:54 crc kubenswrapper[4695]: I0320 11:09:54.401564 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-58888569cf-fq55h" Mar 20 11:09:54 crc kubenswrapper[4695]: I0320 11:09:54.790244 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-774ff5745b-6kwgs"] Mar 20 11:09:54 crc kubenswrapper[4695]: I0320 11:09:54.791970 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-774ff5745b-6kwgs" Mar 20 11:09:54 crc kubenswrapper[4695]: W0320 11:09:54.798327 4695 reflector.go:561] object-"metallb-system"/"controller-dockercfg-q6skd": failed to list *v1.Secret: secrets "controller-dockercfg-q6skd" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Mar 20 11:09:54 crc kubenswrapper[4695]: E0320 11:09:54.798407 4695 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"controller-dockercfg-q6skd\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"controller-dockercfg-q6skd\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 11:09:54 crc kubenswrapper[4695]: W0320 11:09:54.801072 4695 reflector.go:561] object-"metallb-system"/"metallb-webhook-cert": failed to list *v1.Secret: secrets "metallb-webhook-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Mar 20 11:09:54 crc kubenswrapper[4695]: E0320 11:09:54.801109 4695 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"metallb-webhook-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"metallb-webhook-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 11:09:54 crc kubenswrapper[4695]: W0320 11:09:54.801215 4695 reflector.go:561] object-"metallb-system"/"metallb-operator-webhook-server-service-cert": failed to list *v1.Secret: secrets 
"metallb-operator-webhook-server-service-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "metallb-system": no relationship found between node 'crc' and this object Mar 20 11:09:54 crc kubenswrapper[4695]: E0320 11:09:54.801243 4695 reflector.go:158] "Unhandled Error" err="object-\"metallb-system\"/\"metallb-operator-webhook-server-service-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"metallb-operator-webhook-server-service-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"metallb-system\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 11:09:54 crc kubenswrapper[4695]: I0320 11:09:54.873188 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/29bed4c6-fa24-4948-b267-ad2f5827d72f-webhook-cert\") pod \"metallb-operator-webhook-server-774ff5745b-6kwgs\" (UID: \"29bed4c6-fa24-4948-b267-ad2f5827d72f\") " pod="metallb-system/metallb-operator-webhook-server-774ff5745b-6kwgs" Mar 20 11:09:54 crc kubenswrapper[4695]: I0320 11:09:54.873308 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/29bed4c6-fa24-4948-b267-ad2f5827d72f-apiservice-cert\") pod \"metallb-operator-webhook-server-774ff5745b-6kwgs\" (UID: \"29bed4c6-fa24-4948-b267-ad2f5827d72f\") " pod="metallb-system/metallb-operator-webhook-server-774ff5745b-6kwgs" Mar 20 11:09:54 crc kubenswrapper[4695]: I0320 11:09:54.873369 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d6wd\" (UniqueName: \"kubernetes.io/projected/29bed4c6-fa24-4948-b267-ad2f5827d72f-kube-api-access-8d6wd\") pod \"metallb-operator-webhook-server-774ff5745b-6kwgs\" (UID: 
\"29bed4c6-fa24-4948-b267-ad2f5827d72f\") " pod="metallb-system/metallb-operator-webhook-server-774ff5745b-6kwgs" Mar 20 11:09:54 crc kubenswrapper[4695]: I0320 11:09:54.963303 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-774ff5745b-6kwgs"] Mar 20 11:09:55 crc kubenswrapper[4695]: I0320 11:09:55.025780 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/29bed4c6-fa24-4948-b267-ad2f5827d72f-webhook-cert\") pod \"metallb-operator-webhook-server-774ff5745b-6kwgs\" (UID: \"29bed4c6-fa24-4948-b267-ad2f5827d72f\") " pod="metallb-system/metallb-operator-webhook-server-774ff5745b-6kwgs" Mar 20 11:09:55 crc kubenswrapper[4695]: I0320 11:09:55.025854 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/29bed4c6-fa24-4948-b267-ad2f5827d72f-apiservice-cert\") pod \"metallb-operator-webhook-server-774ff5745b-6kwgs\" (UID: \"29bed4c6-fa24-4948-b267-ad2f5827d72f\") " pod="metallb-system/metallb-operator-webhook-server-774ff5745b-6kwgs" Mar 20 11:09:55 crc kubenswrapper[4695]: I0320 11:09:55.025888 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d6wd\" (UniqueName: \"kubernetes.io/projected/29bed4c6-fa24-4948-b267-ad2f5827d72f-kube-api-access-8d6wd\") pod \"metallb-operator-webhook-server-774ff5745b-6kwgs\" (UID: \"29bed4c6-fa24-4948-b267-ad2f5827d72f\") " pod="metallb-system/metallb-operator-webhook-server-774ff5745b-6kwgs" Mar 20 11:09:55 crc kubenswrapper[4695]: I0320 11:09:55.210147 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d6wd\" (UniqueName: \"kubernetes.io/projected/29bed4c6-fa24-4948-b267-ad2f5827d72f-kube-api-access-8d6wd\") pod \"metallb-operator-webhook-server-774ff5745b-6kwgs\" (UID: \"29bed4c6-fa24-4948-b267-ad2f5827d72f\") " 
pod="metallb-system/metallb-operator-webhook-server-774ff5745b-6kwgs" Mar 20 11:09:55 crc kubenswrapper[4695]: I0320 11:09:55.623545 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-58888569cf-fq55h"] Mar 20 11:09:55 crc kubenswrapper[4695]: W0320 11:09:55.639563 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd154e4d7_d533_4e4a_801a_b5db6b39b5f9.slice/crio-ed80a0c614fbe09c9d58f799fb83bb1f057f17e5476daf1862729cf8336329ff WatchSource:0}: Error finding container ed80a0c614fbe09c9d58f799fb83bb1f057f17e5476daf1862729cf8336329ff: Status 404 returned error can't find the container with id ed80a0c614fbe09c9d58f799fb83bb1f057f17e5476daf1862729cf8336329ff Mar 20 11:09:56 crc kubenswrapper[4695]: I0320 11:09:56.017644 4695 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 20 11:09:56 crc kubenswrapper[4695]: I0320 11:09:56.033698 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/29bed4c6-fa24-4948-b267-ad2f5827d72f-webhook-cert\") pod \"metallb-operator-webhook-server-774ff5745b-6kwgs\" (UID: \"29bed4c6-fa24-4948-b267-ad2f5827d72f\") " pod="metallb-system/metallb-operator-webhook-server-774ff5745b-6kwgs" Mar 20 11:09:56 crc kubenswrapper[4695]: I0320 11:09:56.035509 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/29bed4c6-fa24-4948-b267-ad2f5827d72f-apiservice-cert\") pod \"metallb-operator-webhook-server-774ff5745b-6kwgs\" (UID: \"29bed4c6-fa24-4948-b267-ad2f5827d72f\") " pod="metallb-system/metallb-operator-webhook-server-774ff5745b-6kwgs" Mar 20 11:09:56 crc kubenswrapper[4695]: I0320 11:09:56.242404 4695 reflector.go:368] Caches populated for *v1.Secret from 
object-"metallb-system"/"metallb-webhook-cert" Mar 20 11:09:56 crc kubenswrapper[4695]: I0320 11:09:56.306537 4695 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-q6skd" Mar 20 11:09:56 crc kubenswrapper[4695]: I0320 11:09:56.335502 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-774ff5745b-6kwgs" Mar 20 11:09:56 crc kubenswrapper[4695]: I0320 11:09:56.508365 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-58888569cf-fq55h" event={"ID":"d154e4d7-d533-4e4a-801a-b5db6b39b5f9","Type":"ContainerStarted","Data":"ed80a0c614fbe09c9d58f799fb83bb1f057f17e5476daf1862729cf8336329ff"} Mar 20 11:09:56 crc kubenswrapper[4695]: I0320 11:09:56.841618 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-774ff5745b-6kwgs"] Mar 20 11:09:56 crc kubenswrapper[4695]: W0320 11:09:56.849716 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29bed4c6_fa24_4948_b267_ad2f5827d72f.slice/crio-c638a244edee23dee655824e294fa8b62ea4b9e8027c758d7a047b5106d140cb WatchSource:0}: Error finding container c638a244edee23dee655824e294fa8b62ea4b9e8027c758d7a047b5106d140cb: Status 404 returned error can't find the container with id c638a244edee23dee655824e294fa8b62ea4b9e8027c758d7a047b5106d140cb Mar 20 11:09:57 crc kubenswrapper[4695]: I0320 11:09:57.517029 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-774ff5745b-6kwgs" event={"ID":"29bed4c6-fa24-4948-b267-ad2f5827d72f","Type":"ContainerStarted","Data":"c638a244edee23dee655824e294fa8b62ea4b9e8027c758d7a047b5106d140cb"} Mar 20 11:09:58 crc kubenswrapper[4695]: I0320 11:09:58.030541 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-bbrgl" Mar 20 11:09:58 crc kubenswrapper[4695]: I0320 11:09:58.030589 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bbrgl" Mar 20 11:09:58 crc kubenswrapper[4695]: I0320 11:09:58.105791 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bbrgl" Mar 20 11:09:58 crc kubenswrapper[4695]: I0320 11:09:58.697264 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bbrgl" Mar 20 11:10:00 crc kubenswrapper[4695]: I0320 11:10:00.131344 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566750-8w627"] Mar 20 11:10:00 crc kubenswrapper[4695]: I0320 11:10:00.133347 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566750-8w627" Mar 20 11:10:00 crc kubenswrapper[4695]: I0320 11:10:00.136814 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:10:00 crc kubenswrapper[4695]: I0320 11:10:00.137386 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:10:00 crc kubenswrapper[4695]: I0320 11:10:00.137841 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5kqds" Mar 20 11:10:00 crc kubenswrapper[4695]: I0320 11:10:00.145387 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566750-8w627"] Mar 20 11:10:00 crc kubenswrapper[4695]: I0320 11:10:00.233796 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bbrgl"] Mar 20 11:10:00 crc kubenswrapper[4695]: I0320 11:10:00.317074 4695 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnrpw\" (UniqueName: \"kubernetes.io/projected/b7bded3b-99e6-4985-8928-f58fc6203a14-kube-api-access-xnrpw\") pod \"auto-csr-approver-29566750-8w627\" (UID: \"b7bded3b-99e6-4985-8928-f58fc6203a14\") " pod="openshift-infra/auto-csr-approver-29566750-8w627" Mar 20 11:10:00 crc kubenswrapper[4695]: I0320 11:10:00.418852 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnrpw\" (UniqueName: \"kubernetes.io/projected/b7bded3b-99e6-4985-8928-f58fc6203a14-kube-api-access-xnrpw\") pod \"auto-csr-approver-29566750-8w627\" (UID: \"b7bded3b-99e6-4985-8928-f58fc6203a14\") " pod="openshift-infra/auto-csr-approver-29566750-8w627" Mar 20 11:10:00 crc kubenswrapper[4695]: I0320 11:10:00.538772 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnrpw\" (UniqueName: \"kubernetes.io/projected/b7bded3b-99e6-4985-8928-f58fc6203a14-kube-api-access-xnrpw\") pod \"auto-csr-approver-29566750-8w627\" (UID: \"b7bded3b-99e6-4985-8928-f58fc6203a14\") " pod="openshift-infra/auto-csr-approver-29566750-8w627" Mar 20 11:10:00 crc kubenswrapper[4695]: I0320 11:10:00.543249 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bbrgl" podUID="53304d76-9511-4108-96ed-45c5815f9d47" containerName="registry-server" containerID="cri-o://d7cd6f60f0131c910c6836b8c0ba9bf78029d9c7c48045032bdbd603b23f6976" gracePeriod=2 Mar 20 11:10:00 crc kubenswrapper[4695]: I0320 11:10:00.759754 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566750-8w627" Mar 20 11:10:01 crc kubenswrapper[4695]: I0320 11:10:01.010056 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bbrgl" Mar 20 11:10:01 crc kubenswrapper[4695]: I0320 11:10:01.074993 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566750-8w627"] Mar 20 11:10:01 crc kubenswrapper[4695]: W0320 11:10:01.085440 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7bded3b_99e6_4985_8928_f58fc6203a14.slice/crio-775f87052d178a4d1b6f72999b695cff48d51c8fa242d1d31e46ebc1ad0651d6 WatchSource:0}: Error finding container 775f87052d178a4d1b6f72999b695cff48d51c8fa242d1d31e46ebc1ad0651d6: Status 404 returned error can't find the container with id 775f87052d178a4d1b6f72999b695cff48d51c8fa242d1d31e46ebc1ad0651d6 Mar 20 11:10:01 crc kubenswrapper[4695]: I0320 11:10:01.139175 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d9j4\" (UniqueName: \"kubernetes.io/projected/53304d76-9511-4108-96ed-45c5815f9d47-kube-api-access-2d9j4\") pod \"53304d76-9511-4108-96ed-45c5815f9d47\" (UID: \"53304d76-9511-4108-96ed-45c5815f9d47\") " Mar 20 11:10:01 crc kubenswrapper[4695]: I0320 11:10:01.139299 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53304d76-9511-4108-96ed-45c5815f9d47-utilities\") pod \"53304d76-9511-4108-96ed-45c5815f9d47\" (UID: \"53304d76-9511-4108-96ed-45c5815f9d47\") " Mar 20 11:10:01 crc kubenswrapper[4695]: I0320 11:10:01.139368 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53304d76-9511-4108-96ed-45c5815f9d47-catalog-content\") pod \"53304d76-9511-4108-96ed-45c5815f9d47\" (UID: \"53304d76-9511-4108-96ed-45c5815f9d47\") " Mar 20 11:10:01 crc kubenswrapper[4695]: I0320 11:10:01.140698 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/53304d76-9511-4108-96ed-45c5815f9d47-utilities" (OuterVolumeSpecName: "utilities") pod "53304d76-9511-4108-96ed-45c5815f9d47" (UID: "53304d76-9511-4108-96ed-45c5815f9d47"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:10:01 crc kubenswrapper[4695]: I0320 11:10:01.147509 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53304d76-9511-4108-96ed-45c5815f9d47-kube-api-access-2d9j4" (OuterVolumeSpecName: "kube-api-access-2d9j4") pod "53304d76-9511-4108-96ed-45c5815f9d47" (UID: "53304d76-9511-4108-96ed-45c5815f9d47"). InnerVolumeSpecName "kube-api-access-2d9j4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:10:01 crc kubenswrapper[4695]: I0320 11:10:01.240742 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d9j4\" (UniqueName: \"kubernetes.io/projected/53304d76-9511-4108-96ed-45c5815f9d47-kube-api-access-2d9j4\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:01 crc kubenswrapper[4695]: I0320 11:10:01.240798 4695 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/53304d76-9511-4108-96ed-45c5815f9d47-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:01 crc kubenswrapper[4695]: I0320 11:10:01.553455 4695 generic.go:334] "Generic (PLEG): container finished" podID="53304d76-9511-4108-96ed-45c5815f9d47" containerID="d7cd6f60f0131c910c6836b8c0ba9bf78029d9c7c48045032bdbd603b23f6976" exitCode=0 Mar 20 11:10:01 crc kubenswrapper[4695]: I0320 11:10:01.553524 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bbrgl" Mar 20 11:10:01 crc kubenswrapper[4695]: I0320 11:10:01.553544 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbrgl" event={"ID":"53304d76-9511-4108-96ed-45c5815f9d47","Type":"ContainerDied","Data":"d7cd6f60f0131c910c6836b8c0ba9bf78029d9c7c48045032bdbd603b23f6976"} Mar 20 11:10:01 crc kubenswrapper[4695]: I0320 11:10:01.553577 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bbrgl" event={"ID":"53304d76-9511-4108-96ed-45c5815f9d47","Type":"ContainerDied","Data":"e03de5029f8503620a3c1cc41bee41b9e62ad46d98312b483aafba30261ec5be"} Mar 20 11:10:01 crc kubenswrapper[4695]: I0320 11:10:01.553598 4695 scope.go:117] "RemoveContainer" containerID="d7cd6f60f0131c910c6836b8c0ba9bf78029d9c7c48045032bdbd603b23f6976" Mar 20 11:10:01 crc kubenswrapper[4695]: I0320 11:10:01.557670 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566750-8w627" event={"ID":"b7bded3b-99e6-4985-8928-f58fc6203a14","Type":"ContainerStarted","Data":"775f87052d178a4d1b6f72999b695cff48d51c8fa242d1d31e46ebc1ad0651d6"} Mar 20 11:10:01 crc kubenswrapper[4695]: I0320 11:10:01.560426 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-58888569cf-fq55h" event={"ID":"d154e4d7-d533-4e4a-801a-b5db6b39b5f9","Type":"ContainerStarted","Data":"20561d8fc6f753631c780f1b5aec89d52637f47952d35149d1edde552ed5fe5f"} Mar 20 11:10:01 crc kubenswrapper[4695]: I0320 11:10:01.561307 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-58888569cf-fq55h" Mar 20 11:10:01 crc kubenswrapper[4695]: I0320 11:10:01.583441 4695 scope.go:117] "RemoveContainer" containerID="bd5c7daebdadef479c76dd733db9238a6f2c6db3423e6251da1f89d2aa0204de" Mar 20 11:10:01 crc kubenswrapper[4695]: 
I0320 11:10:01.587207 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-58888569cf-fq55h" podStartSLOduration=2.438232019 podStartE2EDuration="7.587179664s" podCreationTimestamp="2026-03-20 11:09:54 +0000 UTC" firstStartedPulling="2026-03-20 11:09:55.643723127 +0000 UTC m=+973.424328690" lastFinishedPulling="2026-03-20 11:10:00.792670772 +0000 UTC m=+978.573276335" observedRunningTime="2026-03-20 11:10:01.58665045 +0000 UTC m=+979.367256013" watchObservedRunningTime="2026-03-20 11:10:01.587179664 +0000 UTC m=+979.367785227" Mar 20 11:10:01 crc kubenswrapper[4695]: I0320 11:10:01.610672 4695 scope.go:117] "RemoveContainer" containerID="4ae15098bc58b09dcad782143c72934ff6a2f8e83ef487201aa517cfcd9270ff" Mar 20 11:10:01 crc kubenswrapper[4695]: I0320 11:10:01.634562 4695 scope.go:117] "RemoveContainer" containerID="d7cd6f60f0131c910c6836b8c0ba9bf78029d9c7c48045032bdbd603b23f6976" Mar 20 11:10:01 crc kubenswrapper[4695]: E0320 11:10:01.636202 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7cd6f60f0131c910c6836b8c0ba9bf78029d9c7c48045032bdbd603b23f6976\": container with ID starting with d7cd6f60f0131c910c6836b8c0ba9bf78029d9c7c48045032bdbd603b23f6976 not found: ID does not exist" containerID="d7cd6f60f0131c910c6836b8c0ba9bf78029d9c7c48045032bdbd603b23f6976" Mar 20 11:10:01 crc kubenswrapper[4695]: I0320 11:10:01.636276 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7cd6f60f0131c910c6836b8c0ba9bf78029d9c7c48045032bdbd603b23f6976"} err="failed to get container status \"d7cd6f60f0131c910c6836b8c0ba9bf78029d9c7c48045032bdbd603b23f6976\": rpc error: code = NotFound desc = could not find container \"d7cd6f60f0131c910c6836b8c0ba9bf78029d9c7c48045032bdbd603b23f6976\": container with ID starting with d7cd6f60f0131c910c6836b8c0ba9bf78029d9c7c48045032bdbd603b23f6976 not 
found: ID does not exist" Mar 20 11:10:01 crc kubenswrapper[4695]: I0320 11:10:01.636312 4695 scope.go:117] "RemoveContainer" containerID="bd5c7daebdadef479c76dd733db9238a6f2c6db3423e6251da1f89d2aa0204de" Mar 20 11:10:01 crc kubenswrapper[4695]: E0320 11:10:01.636733 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd5c7daebdadef479c76dd733db9238a6f2c6db3423e6251da1f89d2aa0204de\": container with ID starting with bd5c7daebdadef479c76dd733db9238a6f2c6db3423e6251da1f89d2aa0204de not found: ID does not exist" containerID="bd5c7daebdadef479c76dd733db9238a6f2c6db3423e6251da1f89d2aa0204de" Mar 20 11:10:01 crc kubenswrapper[4695]: I0320 11:10:01.636781 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd5c7daebdadef479c76dd733db9238a6f2c6db3423e6251da1f89d2aa0204de"} err="failed to get container status \"bd5c7daebdadef479c76dd733db9238a6f2c6db3423e6251da1f89d2aa0204de\": rpc error: code = NotFound desc = could not find container \"bd5c7daebdadef479c76dd733db9238a6f2c6db3423e6251da1f89d2aa0204de\": container with ID starting with bd5c7daebdadef479c76dd733db9238a6f2c6db3423e6251da1f89d2aa0204de not found: ID does not exist" Mar 20 11:10:01 crc kubenswrapper[4695]: I0320 11:10:01.636820 4695 scope.go:117] "RemoveContainer" containerID="4ae15098bc58b09dcad782143c72934ff6a2f8e83ef487201aa517cfcd9270ff" Mar 20 11:10:01 crc kubenswrapper[4695]: E0320 11:10:01.637611 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ae15098bc58b09dcad782143c72934ff6a2f8e83ef487201aa517cfcd9270ff\": container with ID starting with 4ae15098bc58b09dcad782143c72934ff6a2f8e83ef487201aa517cfcd9270ff not found: ID does not exist" containerID="4ae15098bc58b09dcad782143c72934ff6a2f8e83ef487201aa517cfcd9270ff" Mar 20 11:10:01 crc kubenswrapper[4695]: I0320 11:10:01.637634 4695 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ae15098bc58b09dcad782143c72934ff6a2f8e83ef487201aa517cfcd9270ff"} err="failed to get container status \"4ae15098bc58b09dcad782143c72934ff6a2f8e83ef487201aa517cfcd9270ff\": rpc error: code = NotFound desc = could not find container \"4ae15098bc58b09dcad782143c72934ff6a2f8e83ef487201aa517cfcd9270ff\": container with ID starting with 4ae15098bc58b09dcad782143c72934ff6a2f8e83ef487201aa517cfcd9270ff not found: ID does not exist" Mar 20 11:10:02 crc kubenswrapper[4695]: I0320 11:10:02.063630 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53304d76-9511-4108-96ed-45c5815f9d47-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "53304d76-9511-4108-96ed-45c5815f9d47" (UID: "53304d76-9511-4108-96ed-45c5815f9d47"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:10:02 crc kubenswrapper[4695]: I0320 11:10:02.172162 4695 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/53304d76-9511-4108-96ed-45c5815f9d47-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:02 crc kubenswrapper[4695]: I0320 11:10:02.197063 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bbrgl"] Mar 20 11:10:02 crc kubenswrapper[4695]: I0320 11:10:02.200901 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bbrgl"] Mar 20 11:10:02 crc kubenswrapper[4695]: I0320 11:10:02.898436 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53304d76-9511-4108-96ed-45c5815f9d47" path="/var/lib/kubelet/pods/53304d76-9511-4108-96ed-45c5815f9d47/volumes" Mar 20 11:10:04 crc kubenswrapper[4695]: I0320 11:10:04.614985 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/metallb-operator-webhook-server-774ff5745b-6kwgs" event={"ID":"29bed4c6-fa24-4948-b267-ad2f5827d72f","Type":"ContainerStarted","Data":"7d8840cf824de94498de1f1859892ea1daef00af095d9b46e8fac622eac78c8f"} Mar 20 11:10:04 crc kubenswrapper[4695]: I0320 11:10:04.616347 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-774ff5745b-6kwgs" Mar 20 11:10:04 crc kubenswrapper[4695]: I0320 11:10:04.644745 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-774ff5745b-6kwgs" podStartSLOduration=3.090511615 podStartE2EDuration="10.644719198s" podCreationTimestamp="2026-03-20 11:09:54 +0000 UTC" firstStartedPulling="2026-03-20 11:09:56.853642607 +0000 UTC m=+974.634248170" lastFinishedPulling="2026-03-20 11:10:04.40785018 +0000 UTC m=+982.188455753" observedRunningTime="2026-03-20 11:10:04.643893116 +0000 UTC m=+982.424498679" watchObservedRunningTime="2026-03-20 11:10:04.644719198 +0000 UTC m=+982.425324751" Mar 20 11:10:05 crc kubenswrapper[4695]: I0320 11:10:05.623821 4695 generic.go:334] "Generic (PLEG): container finished" podID="b7bded3b-99e6-4985-8928-f58fc6203a14" containerID="86eca8eb6dc04e2d8ed1b50291cf7501083ecbb722c7cd38c692c89ce9518d61" exitCode=0 Mar 20 11:10:05 crc kubenswrapper[4695]: I0320 11:10:05.623902 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566750-8w627" event={"ID":"b7bded3b-99e6-4985-8928-f58fc6203a14","Type":"ContainerDied","Data":"86eca8eb6dc04e2d8ed1b50291cf7501083ecbb722c7cd38c692c89ce9518d61"} Mar 20 11:10:07 crc kubenswrapper[4695]: I0320 11:10:07.188252 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566750-8w627" Mar 20 11:10:07 crc kubenswrapper[4695]: I0320 11:10:07.294369 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnrpw\" (UniqueName: \"kubernetes.io/projected/b7bded3b-99e6-4985-8928-f58fc6203a14-kube-api-access-xnrpw\") pod \"b7bded3b-99e6-4985-8928-f58fc6203a14\" (UID: \"b7bded3b-99e6-4985-8928-f58fc6203a14\") " Mar 20 11:10:07 crc kubenswrapper[4695]: I0320 11:10:07.312115 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7bded3b-99e6-4985-8928-f58fc6203a14-kube-api-access-xnrpw" (OuterVolumeSpecName: "kube-api-access-xnrpw") pod "b7bded3b-99e6-4985-8928-f58fc6203a14" (UID: "b7bded3b-99e6-4985-8928-f58fc6203a14"). InnerVolumeSpecName "kube-api-access-xnrpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:10:07 crc kubenswrapper[4695]: I0320 11:10:07.396179 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnrpw\" (UniqueName: \"kubernetes.io/projected/b7bded3b-99e6-4985-8928-f58fc6203a14-kube-api-access-xnrpw\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:07 crc kubenswrapper[4695]: I0320 11:10:07.638108 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566750-8w627" event={"ID":"b7bded3b-99e6-4985-8928-f58fc6203a14","Type":"ContainerDied","Data":"775f87052d178a4d1b6f72999b695cff48d51c8fa242d1d31e46ebc1ad0651d6"} Mar 20 11:10:07 crc kubenswrapper[4695]: I0320 11:10:07.638161 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="775f87052d178a4d1b6f72999b695cff48d51c8fa242d1d31e46ebc1ad0651d6" Mar 20 11:10:07 crc kubenswrapper[4695]: I0320 11:10:07.638178 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566750-8w627" Mar 20 11:10:08 crc kubenswrapper[4695]: I0320 11:10:08.256767 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566744-wbhsz"] Mar 20 11:10:08 crc kubenswrapper[4695]: I0320 11:10:08.261339 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566744-wbhsz"] Mar 20 11:10:08 crc kubenswrapper[4695]: I0320 11:10:08.441290 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:10:08 crc kubenswrapper[4695]: I0320 11:10:08.441359 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:10:08 crc kubenswrapper[4695]: I0320 11:10:08.893264 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e0d246a-dbac-4f18-8f90-2f8c7a4fcc96" path="/var/lib/kubelet/pods/2e0d246a-dbac-4f18-8f90-2f8c7a4fcc96/volumes" Mar 20 11:10:16 crc kubenswrapper[4695]: I0320 11:10:16.342706 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-774ff5745b-6kwgs" Mar 20 11:10:26 crc kubenswrapper[4695]: I0320 11:10:26.481724 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sxx2b"] Mar 20 11:10:26 crc kubenswrapper[4695]: E0320 11:10:26.482806 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53304d76-9511-4108-96ed-45c5815f9d47" 
containerName="registry-server" Mar 20 11:10:26 crc kubenswrapper[4695]: I0320 11:10:26.482821 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="53304d76-9511-4108-96ed-45c5815f9d47" containerName="registry-server" Mar 20 11:10:26 crc kubenswrapper[4695]: E0320 11:10:26.482835 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7bded3b-99e6-4985-8928-f58fc6203a14" containerName="oc" Mar 20 11:10:26 crc kubenswrapper[4695]: I0320 11:10:26.482847 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7bded3b-99e6-4985-8928-f58fc6203a14" containerName="oc" Mar 20 11:10:26 crc kubenswrapper[4695]: E0320 11:10:26.482863 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53304d76-9511-4108-96ed-45c5815f9d47" containerName="extract-content" Mar 20 11:10:26 crc kubenswrapper[4695]: I0320 11:10:26.482870 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="53304d76-9511-4108-96ed-45c5815f9d47" containerName="extract-content" Mar 20 11:10:26 crc kubenswrapper[4695]: E0320 11:10:26.482882 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53304d76-9511-4108-96ed-45c5815f9d47" containerName="extract-utilities" Mar 20 11:10:26 crc kubenswrapper[4695]: I0320 11:10:26.482888 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="53304d76-9511-4108-96ed-45c5815f9d47" containerName="extract-utilities" Mar 20 11:10:26 crc kubenswrapper[4695]: I0320 11:10:26.483018 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7bded3b-99e6-4985-8928-f58fc6203a14" containerName="oc" Mar 20 11:10:26 crc kubenswrapper[4695]: I0320 11:10:26.483032 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="53304d76-9511-4108-96ed-45c5815f9d47" containerName="registry-server" Mar 20 11:10:26 crc kubenswrapper[4695]: I0320 11:10:26.484077 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sxx2b" Mar 20 11:10:26 crc kubenswrapper[4695]: I0320 11:10:26.501688 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sxx2b"] Mar 20 11:10:26 crc kubenswrapper[4695]: I0320 11:10:26.603011 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acf5296f-8406-4d9b-be44-99fd0970895a-catalog-content\") pod \"community-operators-sxx2b\" (UID: \"acf5296f-8406-4d9b-be44-99fd0970895a\") " pod="openshift-marketplace/community-operators-sxx2b" Mar 20 11:10:26 crc kubenswrapper[4695]: I0320 11:10:26.603100 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgtk5\" (UniqueName: \"kubernetes.io/projected/acf5296f-8406-4d9b-be44-99fd0970895a-kube-api-access-xgtk5\") pod \"community-operators-sxx2b\" (UID: \"acf5296f-8406-4d9b-be44-99fd0970895a\") " pod="openshift-marketplace/community-operators-sxx2b" Mar 20 11:10:26 crc kubenswrapper[4695]: I0320 11:10:26.603119 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acf5296f-8406-4d9b-be44-99fd0970895a-utilities\") pod \"community-operators-sxx2b\" (UID: \"acf5296f-8406-4d9b-be44-99fd0970895a\") " pod="openshift-marketplace/community-operators-sxx2b" Mar 20 11:10:26 crc kubenswrapper[4695]: I0320 11:10:26.704989 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acf5296f-8406-4d9b-be44-99fd0970895a-catalog-content\") pod \"community-operators-sxx2b\" (UID: \"acf5296f-8406-4d9b-be44-99fd0970895a\") " pod="openshift-marketplace/community-operators-sxx2b" Mar 20 11:10:26 crc kubenswrapper[4695]: I0320 11:10:26.705068 4695 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xgtk5\" (UniqueName: \"kubernetes.io/projected/acf5296f-8406-4d9b-be44-99fd0970895a-kube-api-access-xgtk5\") pod \"community-operators-sxx2b\" (UID: \"acf5296f-8406-4d9b-be44-99fd0970895a\") " pod="openshift-marketplace/community-operators-sxx2b" Mar 20 11:10:26 crc kubenswrapper[4695]: I0320 11:10:26.705087 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acf5296f-8406-4d9b-be44-99fd0970895a-utilities\") pod \"community-operators-sxx2b\" (UID: \"acf5296f-8406-4d9b-be44-99fd0970895a\") " pod="openshift-marketplace/community-operators-sxx2b" Mar 20 11:10:26 crc kubenswrapper[4695]: I0320 11:10:26.705682 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acf5296f-8406-4d9b-be44-99fd0970895a-utilities\") pod \"community-operators-sxx2b\" (UID: \"acf5296f-8406-4d9b-be44-99fd0970895a\") " pod="openshift-marketplace/community-operators-sxx2b" Mar 20 11:10:26 crc kubenswrapper[4695]: I0320 11:10:26.706025 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acf5296f-8406-4d9b-be44-99fd0970895a-catalog-content\") pod \"community-operators-sxx2b\" (UID: \"acf5296f-8406-4d9b-be44-99fd0970895a\") " pod="openshift-marketplace/community-operators-sxx2b" Mar 20 11:10:26 crc kubenswrapper[4695]: I0320 11:10:26.731318 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgtk5\" (UniqueName: \"kubernetes.io/projected/acf5296f-8406-4d9b-be44-99fd0970895a-kube-api-access-xgtk5\") pod \"community-operators-sxx2b\" (UID: \"acf5296f-8406-4d9b-be44-99fd0970895a\") " pod="openshift-marketplace/community-operators-sxx2b" Mar 20 11:10:26 crc kubenswrapper[4695]: I0320 11:10:26.804120 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sxx2b" Mar 20 11:10:27 crc kubenswrapper[4695]: I0320 11:10:27.329388 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sxx2b"] Mar 20 11:10:27 crc kubenswrapper[4695]: I0320 11:10:27.764772 4695 generic.go:334] "Generic (PLEG): container finished" podID="acf5296f-8406-4d9b-be44-99fd0970895a" containerID="0113404c9aa3b10a49d616359cb2605d758bcbc9cb36f933641030fa4bf0dca3" exitCode=0 Mar 20 11:10:27 crc kubenswrapper[4695]: I0320 11:10:27.764847 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxx2b" event={"ID":"acf5296f-8406-4d9b-be44-99fd0970895a","Type":"ContainerDied","Data":"0113404c9aa3b10a49d616359cb2605d758bcbc9cb36f933641030fa4bf0dca3"} Mar 20 11:10:27 crc kubenswrapper[4695]: I0320 11:10:27.764896 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxx2b" event={"ID":"acf5296f-8406-4d9b-be44-99fd0970895a","Type":"ContainerStarted","Data":"371561f4230b65ef2923e8c5f67f37dabd5109739483cc8ccfec32010d5831f2"} Mar 20 11:10:28 crc kubenswrapper[4695]: I0320 11:10:28.773299 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxx2b" event={"ID":"acf5296f-8406-4d9b-be44-99fd0970895a","Type":"ContainerStarted","Data":"8c32a73009d947d77b0e1f9b2af83c42f04ccdc117518be8263f97d7ed47e23f"} Mar 20 11:10:29 crc kubenswrapper[4695]: I0320 11:10:29.784178 4695 generic.go:334] "Generic (PLEG): container finished" podID="acf5296f-8406-4d9b-be44-99fd0970895a" containerID="8c32a73009d947d77b0e1f9b2af83c42f04ccdc117518be8263f97d7ed47e23f" exitCode=0 Mar 20 11:10:29 crc kubenswrapper[4695]: I0320 11:10:29.784308 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxx2b" 
event={"ID":"acf5296f-8406-4d9b-be44-99fd0970895a","Type":"ContainerDied","Data":"8c32a73009d947d77b0e1f9b2af83c42f04ccdc117518be8263f97d7ed47e23f"} Mar 20 11:10:31 crc kubenswrapper[4695]: I0320 11:10:31.493093 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxx2b" event={"ID":"acf5296f-8406-4d9b-be44-99fd0970895a","Type":"ContainerStarted","Data":"d73ef470147571a8fd982fa24f9627746cf419676f1ff560b879a7bf896ec97a"} Mar 20 11:10:31 crc kubenswrapper[4695]: I0320 11:10:31.538510 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sxx2b" podStartSLOduration=3.115691816 podStartE2EDuration="5.538479605s" podCreationTimestamp="2026-03-20 11:10:26 +0000 UTC" firstStartedPulling="2026-03-20 11:10:27.76692742 +0000 UTC m=+1005.547532983" lastFinishedPulling="2026-03-20 11:10:30.189715209 +0000 UTC m=+1007.970320772" observedRunningTime="2026-03-20 11:10:31.534764257 +0000 UTC m=+1009.315369820" watchObservedRunningTime="2026-03-20 11:10:31.538479605 +0000 UTC m=+1009.319085168" Mar 20 11:10:34 crc kubenswrapper[4695]: I0320 11:10:34.408453 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-58888569cf-fq55h" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.200091 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-kkzsx"] Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.203000 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-kkzsx" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.205110 4695 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.205459 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.206991 4695 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-2mcvj" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.209790 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-gjhk4"] Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.211214 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gjhk4" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.212978 4695 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.226880 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-gjhk4"] Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.301577 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd87a6c8-0020-4bfd-b9fb-f75010d387e8-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-gjhk4\" (UID: \"bd87a6c8-0020-4bfd-b9fb-f75010d387e8\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gjhk4" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.301657 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0fbeb433-d5f6-4984-a464-2783df4ccd70-metrics\") pod \"frr-k8s-kkzsx\" 
(UID: \"0fbeb433-d5f6-4984-a464-2783df4ccd70\") " pod="metallb-system/frr-k8s-kkzsx" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.301864 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0fbeb433-d5f6-4984-a464-2783df4ccd70-reloader\") pod \"frr-k8s-kkzsx\" (UID: \"0fbeb433-d5f6-4984-a464-2783df4ccd70\") " pod="metallb-system/frr-k8s-kkzsx" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.301992 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0fbeb433-d5f6-4984-a464-2783df4ccd70-frr-sockets\") pod \"frr-k8s-kkzsx\" (UID: \"0fbeb433-d5f6-4984-a464-2783df4ccd70\") " pod="metallb-system/frr-k8s-kkzsx" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.302070 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kbzm\" (UniqueName: \"kubernetes.io/projected/bd87a6c8-0020-4bfd-b9fb-f75010d387e8-kube-api-access-7kbzm\") pod \"frr-k8s-webhook-server-bcc4b6f68-gjhk4\" (UID: \"bd87a6c8-0020-4bfd-b9fb-f75010d387e8\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gjhk4" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.302271 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0fbeb433-d5f6-4984-a464-2783df4ccd70-metrics-certs\") pod \"frr-k8s-kkzsx\" (UID: \"0fbeb433-d5f6-4984-a464-2783df4ccd70\") " pod="metallb-system/frr-k8s-kkzsx" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.302395 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0fbeb433-d5f6-4984-a464-2783df4ccd70-frr-startup\") pod \"frr-k8s-kkzsx\" (UID: \"0fbeb433-d5f6-4984-a464-2783df4ccd70\") " 
pod="metallb-system/frr-k8s-kkzsx" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.302521 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0fbeb433-d5f6-4984-a464-2783df4ccd70-frr-conf\") pod \"frr-k8s-kkzsx\" (UID: \"0fbeb433-d5f6-4984-a464-2783df4ccd70\") " pod="metallb-system/frr-k8s-kkzsx" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.302636 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8stxz\" (UniqueName: \"kubernetes.io/projected/0fbeb433-d5f6-4984-a464-2783df4ccd70-kube-api-access-8stxz\") pod \"frr-k8s-kkzsx\" (UID: \"0fbeb433-d5f6-4984-a464-2783df4ccd70\") " pod="metallb-system/frr-k8s-kkzsx" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.310690 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-b88xw"] Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.311734 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-b88xw" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.314657 4695 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.315611 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.315640 4695 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.315678 4695 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-7hn4b" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.345076 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-clkw8"] Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.348685 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-clkw8" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.353338 4695 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.358057 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-clkw8"] Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.405018 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjbdn\" (UniqueName: \"kubernetes.io/projected/2a5cc6f9-e338-4a31-a0ed-b29ed14383cf-kube-api-access-pjbdn\") pod \"speaker-b88xw\" (UID: \"2a5cc6f9-e338-4a31-a0ed-b29ed14383cf\") " pod="metallb-system/speaker-b88xw" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.405097 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0fbeb433-d5f6-4984-a464-2783df4ccd70-reloader\") pod \"frr-k8s-kkzsx\" (UID: \"0fbeb433-d5f6-4984-a464-2783df4ccd70\") " pod="metallb-system/frr-k8s-kkzsx" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.405141 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0fbeb433-d5f6-4984-a464-2783df4ccd70-frr-sockets\") pod \"frr-k8s-kkzsx\" (UID: \"0fbeb433-d5f6-4984-a464-2783df4ccd70\") " pod="metallb-system/frr-k8s-kkzsx" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.405192 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kbzm\" (UniqueName: \"kubernetes.io/projected/bd87a6c8-0020-4bfd-b9fb-f75010d387e8-kube-api-access-7kbzm\") pod \"frr-k8s-webhook-server-bcc4b6f68-gjhk4\" (UID: \"bd87a6c8-0020-4bfd-b9fb-f75010d387e8\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gjhk4" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 
11:10:35.405232 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2a5cc6f9-e338-4a31-a0ed-b29ed14383cf-metallb-excludel2\") pod \"speaker-b88xw\" (UID: \"2a5cc6f9-e338-4a31-a0ed-b29ed14383cf\") " pod="metallb-system/speaker-b88xw" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.405272 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0fbeb433-d5f6-4984-a464-2783df4ccd70-metrics-certs\") pod \"frr-k8s-kkzsx\" (UID: \"0fbeb433-d5f6-4984-a464-2783df4ccd70\") " pod="metallb-system/frr-k8s-kkzsx" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.405302 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2a5cc6f9-e338-4a31-a0ed-b29ed14383cf-metrics-certs\") pod \"speaker-b88xw\" (UID: \"2a5cc6f9-e338-4a31-a0ed-b29ed14383cf\") " pod="metallb-system/speaker-b88xw" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.405337 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0fbeb433-d5f6-4984-a464-2783df4ccd70-frr-startup\") pod \"frr-k8s-kkzsx\" (UID: \"0fbeb433-d5f6-4984-a464-2783df4ccd70\") " pod="metallb-system/frr-k8s-kkzsx" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.405623 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2a5cc6f9-e338-4a31-a0ed-b29ed14383cf-memberlist\") pod \"speaker-b88xw\" (UID: \"2a5cc6f9-e338-4a31-a0ed-b29ed14383cf\") " pod="metallb-system/speaker-b88xw" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.405663 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: 
\"kubernetes.io/empty-dir/0fbeb433-d5f6-4984-a464-2783df4ccd70-frr-conf\") pod \"frr-k8s-kkzsx\" (UID: \"0fbeb433-d5f6-4984-a464-2783df4ccd70\") " pod="metallb-system/frr-k8s-kkzsx" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.405787 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8stxz\" (UniqueName: \"kubernetes.io/projected/0fbeb433-d5f6-4984-a464-2783df4ccd70-kube-api-access-8stxz\") pod \"frr-k8s-kkzsx\" (UID: \"0fbeb433-d5f6-4984-a464-2783df4ccd70\") " pod="metallb-system/frr-k8s-kkzsx" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.405949 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd87a6c8-0020-4bfd-b9fb-f75010d387e8-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-gjhk4\" (UID: \"bd87a6c8-0020-4bfd-b9fb-f75010d387e8\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gjhk4" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.405988 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0fbeb433-d5f6-4984-a464-2783df4ccd70-metrics\") pod \"frr-k8s-kkzsx\" (UID: \"0fbeb433-d5f6-4984-a464-2783df4ccd70\") " pod="metallb-system/frr-k8s-kkzsx" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.406011 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0fbeb433-d5f6-4984-a464-2783df4ccd70-reloader\") pod \"frr-k8s-kkzsx\" (UID: \"0fbeb433-d5f6-4984-a464-2783df4ccd70\") " pod="metallb-system/frr-k8s-kkzsx" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.406200 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0fbeb433-d5f6-4984-a464-2783df4ccd70-frr-sockets\") pod \"frr-k8s-kkzsx\" (UID: \"0fbeb433-d5f6-4984-a464-2783df4ccd70\") " pod="metallb-system/frr-k8s-kkzsx" Mar 
20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.406233 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0fbeb433-d5f6-4984-a464-2783df4ccd70-frr-conf\") pod \"frr-k8s-kkzsx\" (UID: \"0fbeb433-d5f6-4984-a464-2783df4ccd70\") " pod="metallb-system/frr-k8s-kkzsx" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.406499 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0fbeb433-d5f6-4984-a464-2783df4ccd70-metrics\") pod \"frr-k8s-kkzsx\" (UID: \"0fbeb433-d5f6-4984-a464-2783df4ccd70\") " pod="metallb-system/frr-k8s-kkzsx" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.406972 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0fbeb433-d5f6-4984-a464-2783df4ccd70-frr-startup\") pod \"frr-k8s-kkzsx\" (UID: \"0fbeb433-d5f6-4984-a464-2783df4ccd70\") " pod="metallb-system/frr-k8s-kkzsx" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.414712 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd87a6c8-0020-4bfd-b9fb-f75010d387e8-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-gjhk4\" (UID: \"bd87a6c8-0020-4bfd-b9fb-f75010d387e8\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gjhk4" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.419652 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0fbeb433-d5f6-4984-a464-2783df4ccd70-metrics-certs\") pod \"frr-k8s-kkzsx\" (UID: \"0fbeb433-d5f6-4984-a464-2783df4ccd70\") " pod="metallb-system/frr-k8s-kkzsx" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.425956 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kbzm\" (UniqueName: 
\"kubernetes.io/projected/bd87a6c8-0020-4bfd-b9fb-f75010d387e8-kube-api-access-7kbzm\") pod \"frr-k8s-webhook-server-bcc4b6f68-gjhk4\" (UID: \"bd87a6c8-0020-4bfd-b9fb-f75010d387e8\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gjhk4" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.429633 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8stxz\" (UniqueName: \"kubernetes.io/projected/0fbeb433-d5f6-4984-a464-2783df4ccd70-kube-api-access-8stxz\") pod \"frr-k8s-kkzsx\" (UID: \"0fbeb433-d5f6-4984-a464-2783df4ccd70\") " pod="metallb-system/frr-k8s-kkzsx" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.507467 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2a5cc6f9-e338-4a31-a0ed-b29ed14383cf-metallb-excludel2\") pod \"speaker-b88xw\" (UID: \"2a5cc6f9-e338-4a31-a0ed-b29ed14383cf\") " pod="metallb-system/speaker-b88xw" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.507544 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2a5cc6f9-e338-4a31-a0ed-b29ed14383cf-metrics-certs\") pod \"speaker-b88xw\" (UID: \"2a5cc6f9-e338-4a31-a0ed-b29ed14383cf\") " pod="metallb-system/speaker-b88xw" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.507593 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z2ql\" (UniqueName: \"kubernetes.io/projected/e3f91838-e214-4456-880e-642525f5c721-kube-api-access-4z2ql\") pod \"controller-7bb4cc7c98-clkw8\" (UID: \"e3f91838-e214-4456-880e-642525f5c721\") " pod="metallb-system/controller-7bb4cc7c98-clkw8" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.507632 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: 
\"kubernetes.io/secret/2a5cc6f9-e338-4a31-a0ed-b29ed14383cf-memberlist\") pod \"speaker-b88xw\" (UID: \"2a5cc6f9-e338-4a31-a0ed-b29ed14383cf\") " pod="metallb-system/speaker-b88xw" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.507667 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3f91838-e214-4456-880e-642525f5c721-metrics-certs\") pod \"controller-7bb4cc7c98-clkw8\" (UID: \"e3f91838-e214-4456-880e-642525f5c721\") " pod="metallb-system/controller-7bb4cc7c98-clkw8" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.507737 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjbdn\" (UniqueName: \"kubernetes.io/projected/2a5cc6f9-e338-4a31-a0ed-b29ed14383cf-kube-api-access-pjbdn\") pod \"speaker-b88xw\" (UID: \"2a5cc6f9-e338-4a31-a0ed-b29ed14383cf\") " pod="metallb-system/speaker-b88xw" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.507775 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3f91838-e214-4456-880e-642525f5c721-cert\") pod \"controller-7bb4cc7c98-clkw8\" (UID: \"e3f91838-e214-4456-880e-642525f5c721\") " pod="metallb-system/controller-7bb4cc7c98-clkw8" Mar 20 11:10:35 crc kubenswrapper[4695]: E0320 11:10:35.508611 4695 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 11:10:35 crc kubenswrapper[4695]: E0320 11:10:35.508737 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a5cc6f9-e338-4a31-a0ed-b29ed14383cf-memberlist podName:2a5cc6f9-e338-4a31-a0ed-b29ed14383cf nodeName:}" failed. No retries permitted until 2026-03-20 11:10:36.008698242 +0000 UTC m=+1013.789303805 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/2a5cc6f9-e338-4a31-a0ed-b29ed14383cf-memberlist") pod "speaker-b88xw" (UID: "2a5cc6f9-e338-4a31-a0ed-b29ed14383cf") : secret "metallb-memberlist" not found Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.509008 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2a5cc6f9-e338-4a31-a0ed-b29ed14383cf-metallb-excludel2\") pod \"speaker-b88xw\" (UID: \"2a5cc6f9-e338-4a31-a0ed-b29ed14383cf\") " pod="metallb-system/speaker-b88xw" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.514289 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2a5cc6f9-e338-4a31-a0ed-b29ed14383cf-metrics-certs\") pod \"speaker-b88xw\" (UID: \"2a5cc6f9-e338-4a31-a0ed-b29ed14383cf\") " pod="metallb-system/speaker-b88xw" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.524947 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-kkzsx" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.533149 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjbdn\" (UniqueName: \"kubernetes.io/projected/2a5cc6f9-e338-4a31-a0ed-b29ed14383cf-kube-api-access-pjbdn\") pod \"speaker-b88xw\" (UID: \"2a5cc6f9-e338-4a31-a0ed-b29ed14383cf\") " pod="metallb-system/speaker-b88xw" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.534730 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gjhk4" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.609410 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z2ql\" (UniqueName: \"kubernetes.io/projected/e3f91838-e214-4456-880e-642525f5c721-kube-api-access-4z2ql\") pod \"controller-7bb4cc7c98-clkw8\" (UID: \"e3f91838-e214-4456-880e-642525f5c721\") " pod="metallb-system/controller-7bb4cc7c98-clkw8" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.609524 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3f91838-e214-4456-880e-642525f5c721-metrics-certs\") pod \"controller-7bb4cc7c98-clkw8\" (UID: \"e3f91838-e214-4456-880e-642525f5c721\") " pod="metallb-system/controller-7bb4cc7c98-clkw8" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.609785 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3f91838-e214-4456-880e-642525f5c721-cert\") pod \"controller-7bb4cc7c98-clkw8\" (UID: \"e3f91838-e214-4456-880e-642525f5c721\") " pod="metallb-system/controller-7bb4cc7c98-clkw8" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.615284 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e3f91838-e214-4456-880e-642525f5c721-metrics-certs\") pod \"controller-7bb4cc7c98-clkw8\" (UID: \"e3f91838-e214-4456-880e-642525f5c721\") " pod="metallb-system/controller-7bb4cc7c98-clkw8" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.618152 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3f91838-e214-4456-880e-642525f5c721-cert\") pod \"controller-7bb4cc7c98-clkw8\" (UID: \"e3f91838-e214-4456-880e-642525f5c721\") " pod="metallb-system/controller-7bb4cc7c98-clkw8" Mar 20 
11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.631757 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z2ql\" (UniqueName: \"kubernetes.io/projected/e3f91838-e214-4456-880e-642525f5c721-kube-api-access-4z2ql\") pod \"controller-7bb4cc7c98-clkw8\" (UID: \"e3f91838-e214-4456-880e-642525f5c721\") " pod="metallb-system/controller-7bb4cc7c98-clkw8" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.666280 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-clkw8" Mar 20 11:10:35 crc kubenswrapper[4695]: I0320 11:10:35.947363 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-gjhk4"] Mar 20 11:10:36 crc kubenswrapper[4695]: I0320 11:10:36.015884 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2a5cc6f9-e338-4a31-a0ed-b29ed14383cf-memberlist\") pod \"speaker-b88xw\" (UID: \"2a5cc6f9-e338-4a31-a0ed-b29ed14383cf\") " pod="metallb-system/speaker-b88xw" Mar 20 11:10:36 crc kubenswrapper[4695]: E0320 11:10:36.015986 4695 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 11:10:36 crc kubenswrapper[4695]: E0320 11:10:36.016123 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a5cc6f9-e338-4a31-a0ed-b29ed14383cf-memberlist podName:2a5cc6f9-e338-4a31-a0ed-b29ed14383cf nodeName:}" failed. No retries permitted until 2026-03-20 11:10:37.016097273 +0000 UTC m=+1014.796702836 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/2a5cc6f9-e338-4a31-a0ed-b29ed14383cf-memberlist") pod "speaker-b88xw" (UID: "2a5cc6f9-e338-4a31-a0ed-b29ed14383cf") : secret "metallb-memberlist" not found Mar 20 11:10:36 crc kubenswrapper[4695]: I0320 11:10:36.026142 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-clkw8"] Mar 20 11:10:36 crc kubenswrapper[4695]: W0320 11:10:36.031816 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3f91838_e214_4456_880e_642525f5c721.slice/crio-3fa95d02ee6d742a2828e74b3dff9e97c7f88ec265c72b2cf42517545e52bf33 WatchSource:0}: Error finding container 3fa95d02ee6d742a2828e74b3dff9e97c7f88ec265c72b2cf42517545e52bf33: Status 404 returned error can't find the container with id 3fa95d02ee6d742a2828e74b3dff9e97c7f88ec265c72b2cf42517545e52bf33 Mar 20 11:10:36 crc kubenswrapper[4695]: I0320 11:10:36.532657 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kkzsx" event={"ID":"0fbeb433-d5f6-4984-a464-2783df4ccd70","Type":"ContainerStarted","Data":"782976b7b18c1fd08dfce0cb813daf8ccd59a64da60161265dd5eb0a96bee210"} Mar 20 11:10:36 crc kubenswrapper[4695]: I0320 11:10:36.535520 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-clkw8" event={"ID":"e3f91838-e214-4456-880e-642525f5c721","Type":"ContainerStarted","Data":"0154f5a7e9eadb18443ebfa777b6e93c6cd22b8b63377d815a5a891ed88996d3"} Mar 20 11:10:36 crc kubenswrapper[4695]: I0320 11:10:36.535555 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-clkw8" event={"ID":"e3f91838-e214-4456-880e-642525f5c721","Type":"ContainerStarted","Data":"2db07843ff131187690c7856030be289831bdf447159aa3e09c9557067ee7066"} Mar 20 11:10:36 crc kubenswrapper[4695]: I0320 11:10:36.535566 4695 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/controller-7bb4cc7c98-clkw8" event={"ID":"e3f91838-e214-4456-880e-642525f5c721","Type":"ContainerStarted","Data":"3fa95d02ee6d742a2828e74b3dff9e97c7f88ec265c72b2cf42517545e52bf33"} Mar 20 11:10:36 crc kubenswrapper[4695]: I0320 11:10:36.550204 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-clkw8" Mar 20 11:10:36 crc kubenswrapper[4695]: I0320 11:10:36.555512 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gjhk4" event={"ID":"bd87a6c8-0020-4bfd-b9fb-f75010d387e8","Type":"ContainerStarted","Data":"3fc70ba4b3b019e1011243e20d5e674ce41e683d32c0c2d0510343b65f0ba115"} Mar 20 11:10:36 crc kubenswrapper[4695]: I0320 11:10:36.804932 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sxx2b" Mar 20 11:10:36 crc kubenswrapper[4695]: I0320 11:10:36.807056 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sxx2b" Mar 20 11:10:36 crc kubenswrapper[4695]: I0320 11:10:36.862540 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sxx2b" Mar 20 11:10:37 crc kubenswrapper[4695]: I0320 11:10:37.059018 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-clkw8" podStartSLOduration=2.059000826 podStartE2EDuration="2.059000826s" podCreationTimestamp="2026-03-20 11:10:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:10:37.052094004 +0000 UTC m=+1014.832699577" watchObservedRunningTime="2026-03-20 11:10:37.059000826 +0000 UTC m=+1014.839606389" Mar 20 11:10:37 crc kubenswrapper[4695]: I0320 11:10:37.066309 4695 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2a5cc6f9-e338-4a31-a0ed-b29ed14383cf-memberlist\") pod \"speaker-b88xw\" (UID: \"2a5cc6f9-e338-4a31-a0ed-b29ed14383cf\") " pod="metallb-system/speaker-b88xw" Mar 20 11:10:37 crc kubenswrapper[4695]: I0320 11:10:37.079579 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2a5cc6f9-e338-4a31-a0ed-b29ed14383cf-memberlist\") pod \"speaker-b88xw\" (UID: \"2a5cc6f9-e338-4a31-a0ed-b29ed14383cf\") " pod="metallb-system/speaker-b88xw" Mar 20 11:10:37 crc kubenswrapper[4695]: I0320 11:10:37.129059 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-b88xw" Mar 20 11:10:37 crc kubenswrapper[4695]: I0320 11:10:37.583122 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-b88xw" event={"ID":"2a5cc6f9-e338-4a31-a0ed-b29ed14383cf","Type":"ContainerStarted","Data":"2151d75cf509e64588475fbe7659a36cf65200f2b8af4375fa1d47d38cf5d4bb"} Mar 20 11:10:37 crc kubenswrapper[4695]: I0320 11:10:37.583672 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-b88xw" event={"ID":"2a5cc6f9-e338-4a31-a0ed-b29ed14383cf","Type":"ContainerStarted","Data":"3463d1e9cd1e569c8af0c55202e67b81aa03274cec2dbf9433ac82e8ab43be4a"} Mar 20 11:10:37 crc kubenswrapper[4695]: I0320 11:10:37.643458 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sxx2b" Mar 20 11:10:37 crc kubenswrapper[4695]: I0320 11:10:37.741090 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sxx2b"] Mar 20 11:10:38 crc kubenswrapper[4695]: I0320 11:10:38.430802 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 11:10:38 crc kubenswrapper[4695]: I0320 11:10:38.431339 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 11:10:38 crc kubenswrapper[4695]: I0320 11:10:38.431407 4695 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5"
Mar 20 11:10:38 crc kubenswrapper[4695]: I0320 11:10:38.432283 4695 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b7f145b88a381dab0a3af9969335c65e981dd0f3a0b106c999ecbed0c035eef5"} pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 11:10:38 crc kubenswrapper[4695]: I0320 11:10:38.432381 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" containerID="cri-o://b7f145b88a381dab0a3af9969335c65e981dd0f3a0b106c999ecbed0c035eef5" gracePeriod=600
Mar 20 11:10:38 crc kubenswrapper[4695]: I0320 11:10:38.602079 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-b88xw" event={"ID":"2a5cc6f9-e338-4a31-a0ed-b29ed14383cf","Type":"ContainerStarted","Data":"f1e50396c0290ab53104566e8d01a31beb629031dac965338f79ce2fca3a2da7"}
Mar 20 11:10:38 crc kubenswrapper[4695]: I0320 11:10:38.602706 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-b88xw"
Mar 20 11:10:38 crc kubenswrapper[4695]: I0320 11:10:38.630293 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-b88xw" podStartSLOduration=3.630269053 podStartE2EDuration="3.630269053s" podCreationTimestamp="2026-03-20 11:10:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:10:38.625786635 +0000 UTC m=+1016.406392208" watchObservedRunningTime="2026-03-20 11:10:38.630269053 +0000 UTC m=+1016.410874616"
Mar 20 11:10:39 crc kubenswrapper[4695]: I0320 11:10:39.613153 4695 generic.go:334] "Generic (PLEG): container finished" podID="7859c924-84d7-4855-901e-c77a02c56e3a" containerID="b7f145b88a381dab0a3af9969335c65e981dd0f3a0b106c999ecbed0c035eef5" exitCode=0
Mar 20 11:10:39 crc kubenswrapper[4695]: I0320 11:10:39.613368 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" event={"ID":"7859c924-84d7-4855-901e-c77a02c56e3a","Type":"ContainerDied","Data":"b7f145b88a381dab0a3af9969335c65e981dd0f3a0b106c999ecbed0c035eef5"}
Mar 20 11:10:39 crc kubenswrapper[4695]: I0320 11:10:39.613976 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" event={"ID":"7859c924-84d7-4855-901e-c77a02c56e3a","Type":"ContainerStarted","Data":"cceac66b33a60ba76fb29486822ed6970274dda5bcbe64eb92732cba195eadd4"}
Mar 20 11:10:39 crc kubenswrapper[4695]: I0320 11:10:39.614026 4695 scope.go:117] "RemoveContainer" containerID="2df2ef181c2d99312310276228ee6486ca56cb58973b41cd3d1cfa930619b521"
Mar 20 11:10:39 crc kubenswrapper[4695]: I0320 11:10:39.614727 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sxx2b" podUID="acf5296f-8406-4d9b-be44-99fd0970895a" containerName="registry-server" containerID="cri-o://d73ef470147571a8fd982fa24f9627746cf419676f1ff560b879a7bf896ec97a" gracePeriod=2
Mar 20 11:10:40 crc kubenswrapper[4695]: I0320 11:10:40.639358 4695 generic.go:334] "Generic (PLEG): container finished" podID="acf5296f-8406-4d9b-be44-99fd0970895a" containerID="d73ef470147571a8fd982fa24f9627746cf419676f1ff560b879a7bf896ec97a" exitCode=0
Mar 20 11:10:40 crc kubenswrapper[4695]: I0320 11:10:40.639444 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxx2b" event={"ID":"acf5296f-8406-4d9b-be44-99fd0970895a","Type":"ContainerDied","Data":"d73ef470147571a8fd982fa24f9627746cf419676f1ff560b879a7bf896ec97a"}
Mar 20 11:10:41 crc kubenswrapper[4695]: I0320 11:10:41.214470 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sxx2b"
Mar 20 11:10:41 crc kubenswrapper[4695]: I0320 11:10:41.221736 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acf5296f-8406-4d9b-be44-99fd0970895a-utilities\") pod \"acf5296f-8406-4d9b-be44-99fd0970895a\" (UID: \"acf5296f-8406-4d9b-be44-99fd0970895a\") "
Mar 20 11:10:41 crc kubenswrapper[4695]: I0320 11:10:41.221827 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgtk5\" (UniqueName: \"kubernetes.io/projected/acf5296f-8406-4d9b-be44-99fd0970895a-kube-api-access-xgtk5\") pod \"acf5296f-8406-4d9b-be44-99fd0970895a\" (UID: \"acf5296f-8406-4d9b-be44-99fd0970895a\") "
Mar 20 11:10:41 crc kubenswrapper[4695]: I0320 11:10:41.221862 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acf5296f-8406-4d9b-be44-99fd0970895a-catalog-content\") pod \"acf5296f-8406-4d9b-be44-99fd0970895a\" (UID: \"acf5296f-8406-4d9b-be44-99fd0970895a\") "
Mar 20 11:10:41 crc kubenswrapper[4695]: I0320 11:10:41.223357 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acf5296f-8406-4d9b-be44-99fd0970895a-utilities" (OuterVolumeSpecName: "utilities") pod "acf5296f-8406-4d9b-be44-99fd0970895a" (UID: "acf5296f-8406-4d9b-be44-99fd0970895a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:10:41 crc kubenswrapper[4695]: I0320 11:10:41.235711 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acf5296f-8406-4d9b-be44-99fd0970895a-kube-api-access-xgtk5" (OuterVolumeSpecName: "kube-api-access-xgtk5") pod "acf5296f-8406-4d9b-be44-99fd0970895a" (UID: "acf5296f-8406-4d9b-be44-99fd0970895a"). InnerVolumeSpecName "kube-api-access-xgtk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:10:41 crc kubenswrapper[4695]: I0320 11:10:41.286468 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acf5296f-8406-4d9b-be44-99fd0970895a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "acf5296f-8406-4d9b-be44-99fd0970895a" (UID: "acf5296f-8406-4d9b-be44-99fd0970895a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:10:41 crc kubenswrapper[4695]: I0320 11:10:41.326097 4695 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acf5296f-8406-4d9b-be44-99fd0970895a-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 11:10:41 crc kubenswrapper[4695]: I0320 11:10:41.326142 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgtk5\" (UniqueName: \"kubernetes.io/projected/acf5296f-8406-4d9b-be44-99fd0970895a-kube-api-access-xgtk5\") on node \"crc\" DevicePath \"\""
Mar 20 11:10:41 crc kubenswrapper[4695]: I0320 11:10:41.326158 4695 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acf5296f-8406-4d9b-be44-99fd0970895a-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 11:10:41 crc kubenswrapper[4695]: I0320 11:10:41.663209 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxx2b" event={"ID":"acf5296f-8406-4d9b-be44-99fd0970895a","Type":"ContainerDied","Data":"371561f4230b65ef2923e8c5f67f37dabd5109739483cc8ccfec32010d5831f2"}
Mar 20 11:10:41 crc kubenswrapper[4695]: I0320 11:10:41.663573 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sxx2b"
Mar 20 11:10:41 crc kubenswrapper[4695]: I0320 11:10:41.663714 4695 scope.go:117] "RemoveContainer" containerID="d73ef470147571a8fd982fa24f9627746cf419676f1ff560b879a7bf896ec97a"
Mar 20 11:10:41 crc kubenswrapper[4695]: I0320 11:10:41.708852 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sxx2b"]
Mar 20 11:10:41 crc kubenswrapper[4695]: I0320 11:10:41.718025 4695 scope.go:117] "RemoveContainer" containerID="8c32a73009d947d77b0e1f9b2af83c42f04ccdc117518be8263f97d7ed47e23f"
Mar 20 11:10:41 crc kubenswrapper[4695]: I0320 11:10:41.724062 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sxx2b"]
Mar 20 11:10:41 crc kubenswrapper[4695]: I0320 11:10:41.759254 4695 scope.go:117] "RemoveContainer" containerID="0113404c9aa3b10a49d616359cb2605d758bcbc9cb36f933641030fa4bf0dca3"
Mar 20 11:10:42 crc kubenswrapper[4695]: I0320 11:10:42.904480 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acf5296f-8406-4d9b-be44-99fd0970895a" path="/var/lib/kubelet/pods/acf5296f-8406-4d9b-be44-99fd0970895a/volumes"
Mar 20 11:10:47 crc kubenswrapper[4695]: I0320 11:10:47.134244 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-b88xw"
Mar 20 11:10:47 crc kubenswrapper[4695]: I0320 11:10:47.722697 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gjhk4" event={"ID":"bd87a6c8-0020-4bfd-b9fb-f75010d387e8","Type":"ContainerStarted","Data":"a1a0e56b0bb8f15cb915bff49de8701b1c2d972fccff97c686be9d0427b63288"}
Mar 20 11:10:47 crc kubenswrapper[4695]: I0320 11:10:47.723441 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gjhk4"
Mar 20 11:10:47 crc kubenswrapper[4695]: I0320 11:10:47.724964 4695 generic.go:334] "Generic (PLEG): container finished" podID="0fbeb433-d5f6-4984-a464-2783df4ccd70" containerID="24044a246904a4859a0207fe659636c3975816ca7e989f8f3a881f6440b3dec2" exitCode=0
Mar 20 11:10:47 crc kubenswrapper[4695]: I0320 11:10:47.725020 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kkzsx" event={"ID":"0fbeb433-d5f6-4984-a464-2783df4ccd70","Type":"ContainerDied","Data":"24044a246904a4859a0207fe659636c3975816ca7e989f8f3a881f6440b3dec2"}
Mar 20 11:10:47 crc kubenswrapper[4695]: I0320 11:10:47.748796 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gjhk4" podStartSLOduration=1.737374506 podStartE2EDuration="12.748768838s" podCreationTimestamp="2026-03-20 11:10:35 +0000 UTC" firstStartedPulling="2026-03-20 11:10:35.959859212 +0000 UTC m=+1013.740464775" lastFinishedPulling="2026-03-20 11:10:46.971253544 +0000 UTC m=+1024.751859107" observedRunningTime="2026-03-20 11:10:47.745238255 +0000 UTC m=+1025.525843808" watchObservedRunningTime="2026-03-20 11:10:47.748768838 +0000 UTC m=+1025.529374401"
Mar 20 11:10:48 crc kubenswrapper[4695]: I0320 11:10:48.734341 4695 generic.go:334] "Generic (PLEG): container finished" podID="0fbeb433-d5f6-4984-a464-2783df4ccd70" containerID="274c29214e2bc0ddc697492998563b44cf46b6ea2ea87262caf090872a63f4ae" exitCode=0
Mar 20 11:10:48 crc kubenswrapper[4695]: I0320 11:10:48.734454 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kkzsx" event={"ID":"0fbeb433-d5f6-4984-a464-2783df4ccd70","Type":"ContainerDied","Data":"274c29214e2bc0ddc697492998563b44cf46b6ea2ea87262caf090872a63f4ae"}
Mar 20 11:10:49 crc kubenswrapper[4695]: I0320 11:10:49.743591 4695 generic.go:334] "Generic (PLEG): container finished" podID="0fbeb433-d5f6-4984-a464-2783df4ccd70" containerID="e51702a5debd2693bae41fd82d4dbf4b49d1797063f6dc225112c2a1223b68a2" exitCode=0
Mar 20 11:10:49 crc kubenswrapper[4695]: I0320 11:10:49.743719 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kkzsx" event={"ID":"0fbeb433-d5f6-4984-a464-2783df4ccd70","Type":"ContainerDied","Data":"e51702a5debd2693bae41fd82d4dbf4b49d1797063f6dc225112c2a1223b68a2"}
Mar 20 11:10:50 crc kubenswrapper[4695]: I0320 11:10:50.031691 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-qzgz2"]
Mar 20 11:10:50 crc kubenswrapper[4695]: E0320 11:10:50.032177 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acf5296f-8406-4d9b-be44-99fd0970895a" containerName="registry-server"
Mar 20 11:10:50 crc kubenswrapper[4695]: I0320 11:10:50.032207 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="acf5296f-8406-4d9b-be44-99fd0970895a" containerName="registry-server"
Mar 20 11:10:50 crc kubenswrapper[4695]: E0320 11:10:50.032228 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acf5296f-8406-4d9b-be44-99fd0970895a" containerName="extract-content"
Mar 20 11:10:50 crc kubenswrapper[4695]: I0320 11:10:50.032236 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="acf5296f-8406-4d9b-be44-99fd0970895a" containerName="extract-content"
Mar 20 11:10:50 crc kubenswrapper[4695]: E0320 11:10:50.032266 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acf5296f-8406-4d9b-be44-99fd0970895a" containerName="extract-utilities"
Mar 20 11:10:50 crc kubenswrapper[4695]: I0320 11:10:50.032273 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="acf5296f-8406-4d9b-be44-99fd0970895a" containerName="extract-utilities"
Mar 20 11:10:50 crc kubenswrapper[4695]: I0320 11:10:50.032446 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="acf5296f-8406-4d9b-be44-99fd0970895a" containerName="registry-server"
Mar 20 11:10:50 crc kubenswrapper[4695]: I0320 11:10:50.033283 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qzgz2"
Mar 20 11:10:50 crc kubenswrapper[4695]: I0320 11:10:50.036041 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Mar 20 11:10:50 crc kubenswrapper[4695]: I0320 11:10:50.036042 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-tzps9"
Mar 20 11:10:50 crc kubenswrapper[4695]: I0320 11:10:50.036297 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Mar 20 11:10:50 crc kubenswrapper[4695]: I0320 11:10:50.050624 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qzgz2"]
Mar 20 11:10:50 crc kubenswrapper[4695]: I0320 11:10:50.223571 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvmgn\" (UniqueName: \"kubernetes.io/projected/4cca0836-531a-4234-90d7-bf912a4736ad-kube-api-access-fvmgn\") pod \"openstack-operator-index-qzgz2\" (UID: \"4cca0836-531a-4234-90d7-bf912a4736ad\") " pod="openstack-operators/openstack-operator-index-qzgz2"
Mar 20 11:10:50 crc kubenswrapper[4695]: I0320 11:10:50.324821 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvmgn\" (UniqueName: \"kubernetes.io/projected/4cca0836-531a-4234-90d7-bf912a4736ad-kube-api-access-fvmgn\") pod \"openstack-operator-index-qzgz2\" (UID: \"4cca0836-531a-4234-90d7-bf912a4736ad\") " pod="openstack-operators/openstack-operator-index-qzgz2"
Mar 20 11:10:50 crc kubenswrapper[4695]: I0320 11:10:50.345595 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvmgn\" (UniqueName: \"kubernetes.io/projected/4cca0836-531a-4234-90d7-bf912a4736ad-kube-api-access-fvmgn\") pod \"openstack-operator-index-qzgz2\" (UID: \"4cca0836-531a-4234-90d7-bf912a4736ad\") " pod="openstack-operators/openstack-operator-index-qzgz2"
Mar 20 11:10:50 crc kubenswrapper[4695]: I0320 11:10:50.357521 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qzgz2"
Mar 20 11:10:50 crc kubenswrapper[4695]: I0320 11:10:50.796681 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kkzsx" event={"ID":"0fbeb433-d5f6-4984-a464-2783df4ccd70","Type":"ContainerStarted","Data":"a7d268a3c90b9a00411aa884b9055a3b73443db0136165de6af865bd7b611033"}
Mar 20 11:10:50 crc kubenswrapper[4695]: I0320 11:10:50.797226 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kkzsx" event={"ID":"0fbeb433-d5f6-4984-a464-2783df4ccd70","Type":"ContainerStarted","Data":"aac3d995a2ad6cbaa1c0a49204a9094a46a9b523e52bb40dbbc4b938a2f664cb"}
Mar 20 11:10:50 crc kubenswrapper[4695]: I0320 11:10:50.797248 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kkzsx" event={"ID":"0fbeb433-d5f6-4984-a464-2783df4ccd70","Type":"ContainerStarted","Data":"1bd10fbc4e132c8ba4f0c7b038d05edca35ef95c24d12f8994279ecc8859aa1a"}
Mar 20 11:10:50 crc kubenswrapper[4695]: I0320 11:10:50.797260 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kkzsx" event={"ID":"0fbeb433-d5f6-4984-a464-2783df4ccd70","Type":"ContainerStarted","Data":"aafb11b164caeb14a06f7ed832e76231564c8f1fd66dc0fa29f6a91835698e1f"}
Mar 20 11:10:51 crc kubenswrapper[4695]: I0320 11:10:51.098685 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-qzgz2"]
Mar 20 11:10:51 crc kubenswrapper[4695]: I0320 11:10:51.810055 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kkzsx" event={"ID":"0fbeb433-d5f6-4984-a464-2783df4ccd70","Type":"ContainerStarted","Data":"fb997a457f33b968921f499aab0f5303daa7969449da7ccb322245ac3f0b160a"}
Mar 20 11:10:51 crc kubenswrapper[4695]: I0320 11:10:51.810124 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-kkzsx" event={"ID":"0fbeb433-d5f6-4984-a464-2783df4ccd70","Type":"ContainerStarted","Data":"263349c8fc011c8cb73bafd28f72ffa617e3dce1fc424ec8430a097eb3b819d7"}
Mar 20 11:10:51 crc kubenswrapper[4695]: I0320 11:10:51.810251 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-kkzsx"
Mar 20 11:10:51 crc kubenswrapper[4695]: I0320 11:10:51.812844 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qzgz2" event={"ID":"4cca0836-531a-4234-90d7-bf912a4736ad","Type":"ContainerStarted","Data":"b9055a66f17870288645fbddd7916b5e889ad0c56febf0b889063a9a53491859"}
Mar 20 11:10:51 crc kubenswrapper[4695]: I0320 11:10:51.842341 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-kkzsx" podStartSLOduration=5.737067179 podStartE2EDuration="16.842316602s" podCreationTimestamp="2026-03-20 11:10:35 +0000 UTC" firstStartedPulling="2026-03-20 11:10:35.811370252 +0000 UTC m=+1013.591975815" lastFinishedPulling="2026-03-20 11:10:46.916619675 +0000 UTC m=+1024.697225238" observedRunningTime="2026-03-20 11:10:51.839135808 +0000 UTC m=+1029.619741371" watchObservedRunningTime="2026-03-20 11:10:51.842316602 +0000 UTC m=+1029.622922165"
Mar 20 11:10:53 crc kubenswrapper[4695]: I0320 11:10:53.412451 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-qzgz2"]
Mar 20 11:10:54 crc kubenswrapper[4695]: I0320 11:10:54.012329 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-vqtzk"]
Mar 20 11:10:54 crc kubenswrapper[4695]: I0320 11:10:54.013288 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vqtzk"
Mar 20 11:10:54 crc kubenswrapper[4695]: I0320 11:10:54.029971 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-vqtzk"]
Mar 20 11:10:54 crc kubenswrapper[4695]: I0320 11:10:54.066126 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxlcw\" (UniqueName: \"kubernetes.io/projected/c83f3723-7376-428e-b64a-3c081f0dab01-kube-api-access-hxlcw\") pod \"openstack-operator-index-vqtzk\" (UID: \"c83f3723-7376-428e-b64a-3c081f0dab01\") " pod="openstack-operators/openstack-operator-index-vqtzk"
Mar 20 11:10:54 crc kubenswrapper[4695]: I0320 11:10:54.167166 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxlcw\" (UniqueName: \"kubernetes.io/projected/c83f3723-7376-428e-b64a-3c081f0dab01-kube-api-access-hxlcw\") pod \"openstack-operator-index-vqtzk\" (UID: \"c83f3723-7376-428e-b64a-3c081f0dab01\") " pod="openstack-operators/openstack-operator-index-vqtzk"
Mar 20 11:10:54 crc kubenswrapper[4695]: I0320 11:10:54.206336 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxlcw\" (UniqueName: \"kubernetes.io/projected/c83f3723-7376-428e-b64a-3c081f0dab01-kube-api-access-hxlcw\") pod \"openstack-operator-index-vqtzk\" (UID: \"c83f3723-7376-428e-b64a-3c081f0dab01\") " pod="openstack-operators/openstack-operator-index-vqtzk"
Mar 20 11:10:54 crc kubenswrapper[4695]: I0320 11:10:54.387054 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-vqtzk"
Mar 20 11:10:54 crc kubenswrapper[4695]: I0320 11:10:54.865868 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-vqtzk"]
Mar 20 11:10:54 crc kubenswrapper[4695]: W0320 11:10:54.875018 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc83f3723_7376_428e_b64a_3c081f0dab01.slice/crio-fe127dc23a2be1de6073eff9bea87c100e56088900c681b2b4d7d892c9b5b554 WatchSource:0}: Error finding container fe127dc23a2be1de6073eff9bea87c100e56088900c681b2b4d7d892c9b5b554: Status 404 returned error can't find the container with id fe127dc23a2be1de6073eff9bea87c100e56088900c681b2b4d7d892c9b5b554
Mar 20 11:10:54 crc kubenswrapper[4695]: I0320 11:10:54.929033 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vqtzk" event={"ID":"c83f3723-7376-428e-b64a-3c081f0dab01","Type":"ContainerStarted","Data":"fe127dc23a2be1de6073eff9bea87c100e56088900c681b2b4d7d892c9b5b554"}
Mar 20 11:10:54 crc kubenswrapper[4695]: I0320 11:10:54.931125 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qzgz2" event={"ID":"4cca0836-531a-4234-90d7-bf912a4736ad","Type":"ContainerStarted","Data":"2e8615ba63a8131ebd8d2ea59a7502bf98045a2777b1c38bf48ce854abeae3a3"}
Mar 20 11:10:54 crc kubenswrapper[4695]: I0320 11:10:54.931376 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-qzgz2" podUID="4cca0836-531a-4234-90d7-bf912a4736ad" containerName="registry-server" containerID="cri-o://2e8615ba63a8131ebd8d2ea59a7502bf98045a2777b1c38bf48ce854abeae3a3" gracePeriod=2
Mar 20 11:10:54 crc kubenswrapper[4695]: I0320 11:10:54.947448 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-qzgz2" podStartSLOduration=1.6496735089999999 podStartE2EDuration="4.947421058s" podCreationTimestamp="2026-03-20 11:10:50 +0000 UTC" firstStartedPulling="2026-03-20 11:10:51.11199893 +0000 UTC m=+1028.892604503" lastFinishedPulling="2026-03-20 11:10:54.409746489 +0000 UTC m=+1032.190352052" observedRunningTime="2026-03-20 11:10:54.94560804 +0000 UTC m=+1032.726213603" watchObservedRunningTime="2026-03-20 11:10:54.947421058 +0000 UTC m=+1032.728026631"
Mar 20 11:10:55 crc kubenswrapper[4695]: I0320 11:10:55.306796 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qzgz2"
Mar 20 11:10:55 crc kubenswrapper[4695]: I0320 11:10:55.485436 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvmgn\" (UniqueName: \"kubernetes.io/projected/4cca0836-531a-4234-90d7-bf912a4736ad-kube-api-access-fvmgn\") pod \"4cca0836-531a-4234-90d7-bf912a4736ad\" (UID: \"4cca0836-531a-4234-90d7-bf912a4736ad\") "
Mar 20 11:10:55 crc kubenswrapper[4695]: I0320 11:10:55.493191 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cca0836-531a-4234-90d7-bf912a4736ad-kube-api-access-fvmgn" (OuterVolumeSpecName: "kube-api-access-fvmgn") pod "4cca0836-531a-4234-90d7-bf912a4736ad" (UID: "4cca0836-531a-4234-90d7-bf912a4736ad"). InnerVolumeSpecName "kube-api-access-fvmgn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:10:55 crc kubenswrapper[4695]: I0320 11:10:55.525828 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-kkzsx"
Mar 20 11:10:55 crc kubenswrapper[4695]: I0320 11:10:55.569528 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-kkzsx"
Mar 20 11:10:55 crc kubenswrapper[4695]: I0320 11:10:55.587471 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvmgn\" (UniqueName: \"kubernetes.io/projected/4cca0836-531a-4234-90d7-bf912a4736ad-kube-api-access-fvmgn\") on node \"crc\" DevicePath \"\""
Mar 20 11:10:55 crc kubenswrapper[4695]: I0320 11:10:55.671643 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-clkw8"
Mar 20 11:10:55 crc kubenswrapper[4695]: I0320 11:10:55.942499 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-vqtzk" event={"ID":"c83f3723-7376-428e-b64a-3c081f0dab01","Type":"ContainerStarted","Data":"053e13d372b618e17ae01bba5d77aeab1d87cbd4d25109db46a44ce7de925a2f"}
Mar 20 11:10:55 crc kubenswrapper[4695]: I0320 11:10:55.947607 4695 generic.go:334] "Generic (PLEG): container finished" podID="4cca0836-531a-4234-90d7-bf912a4736ad" containerID="2e8615ba63a8131ebd8d2ea59a7502bf98045a2777b1c38bf48ce854abeae3a3" exitCode=0
Mar 20 11:10:55 crc kubenswrapper[4695]: I0320 11:10:55.947710 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-qzgz2"
Mar 20 11:10:55 crc kubenswrapper[4695]: I0320 11:10:55.947763 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qzgz2" event={"ID":"4cca0836-531a-4234-90d7-bf912a4736ad","Type":"ContainerDied","Data":"2e8615ba63a8131ebd8d2ea59a7502bf98045a2777b1c38bf48ce854abeae3a3"}
Mar 20 11:10:55 crc kubenswrapper[4695]: I0320 11:10:55.947813 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-qzgz2" event={"ID":"4cca0836-531a-4234-90d7-bf912a4736ad","Type":"ContainerDied","Data":"b9055a66f17870288645fbddd7916b5e889ad0c56febf0b889063a9a53491859"}
Mar 20 11:10:55 crc kubenswrapper[4695]: I0320 11:10:55.947838 4695 scope.go:117] "RemoveContainer" containerID="2e8615ba63a8131ebd8d2ea59a7502bf98045a2777b1c38bf48ce854abeae3a3"
Mar 20 11:10:55 crc kubenswrapper[4695]: I0320 11:10:55.978245 4695 scope.go:117] "RemoveContainer" containerID="2e8615ba63a8131ebd8d2ea59a7502bf98045a2777b1c38bf48ce854abeae3a3"
Mar 20 11:10:55 crc kubenswrapper[4695]: E0320 11:10:55.979246 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e8615ba63a8131ebd8d2ea59a7502bf98045a2777b1c38bf48ce854abeae3a3\": container with ID starting with 2e8615ba63a8131ebd8d2ea59a7502bf98045a2777b1c38bf48ce854abeae3a3 not found: ID does not exist" containerID="2e8615ba63a8131ebd8d2ea59a7502bf98045a2777b1c38bf48ce854abeae3a3"
Mar 20 11:10:55 crc kubenswrapper[4695]: I0320 11:10:55.979320 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e8615ba63a8131ebd8d2ea59a7502bf98045a2777b1c38bf48ce854abeae3a3"} err="failed to get container status \"2e8615ba63a8131ebd8d2ea59a7502bf98045a2777b1c38bf48ce854abeae3a3\": rpc error: code = NotFound desc = could not find container \"2e8615ba63a8131ebd8d2ea59a7502bf98045a2777b1c38bf48ce854abeae3a3\": container with ID starting with 2e8615ba63a8131ebd8d2ea59a7502bf98045a2777b1c38bf48ce854abeae3a3 not found: ID does not exist"
Mar 20 11:10:55 crc kubenswrapper[4695]: I0320 11:10:55.980031 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-vqtzk" podStartSLOduration=2.92118795 podStartE2EDuration="2.980000119s" podCreationTimestamp="2026-03-20 11:10:53 +0000 UTC" firstStartedPulling="2026-03-20 11:10:54.880821244 +0000 UTC m=+1032.661426807" lastFinishedPulling="2026-03-20 11:10:54.939633413 +0000 UTC m=+1032.720238976" observedRunningTime="2026-03-20 11:10:55.972181873 +0000 UTC m=+1033.752787436" watchObservedRunningTime="2026-03-20 11:10:55.980000119 +0000 UTC m=+1033.760605682"
Mar 20 11:10:55 crc kubenswrapper[4695]: I0320 11:10:55.993838 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-qzgz2"]
Mar 20 11:10:55 crc kubenswrapper[4695]: I0320 11:10:55.997183 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-qzgz2"]
Mar 20 11:10:56 crc kubenswrapper[4695]: I0320 11:10:56.899616 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cca0836-531a-4234-90d7-bf912a4736ad" path="/var/lib/kubelet/pods/4cca0836-531a-4234-90d7-bf912a4736ad/volumes"
Mar 20 11:11:04 crc kubenswrapper[4695]: I0320 11:11:04.388130 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-vqtzk"
Mar 20 11:11:04 crc kubenswrapper[4695]: I0320 11:11:04.388991 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-vqtzk"
Mar 20 11:11:04 crc kubenswrapper[4695]: I0320 11:11:04.422656 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-vqtzk"
Mar 20 11:11:05 crc kubenswrapper[4695]: I0320 11:11:05.047886 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-vqtzk"
Mar 20 11:11:05 crc kubenswrapper[4695]: I0320 11:11:05.492025 4695 scope.go:117] "RemoveContainer" containerID="a8038cd44836d910d0a4138dedd63290d21948cd96d3228b49de0b271f34177b"
Mar 20 11:11:05 crc kubenswrapper[4695]: I0320 11:11:05.530404 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-kkzsx"
Mar 20 11:11:05 crc kubenswrapper[4695]: I0320 11:11:05.553695 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-gjhk4"
Mar 20 11:11:09 crc kubenswrapper[4695]: I0320 11:11:09.611484 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vmktl"]
Mar 20 11:11:09 crc kubenswrapper[4695]: E0320 11:11:09.612249 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cca0836-531a-4234-90d7-bf912a4736ad" containerName="registry-server"
Mar 20 11:11:09 crc kubenswrapper[4695]: I0320 11:11:09.612587 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cca0836-531a-4234-90d7-bf912a4736ad" containerName="registry-server"
Mar 20 11:11:09 crc kubenswrapper[4695]: I0320 11:11:09.612744 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cca0836-531a-4234-90d7-bf912a4736ad" containerName="registry-server"
Mar 20 11:11:09 crc kubenswrapper[4695]: I0320 11:11:09.613820 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vmktl"
Mar 20 11:11:09 crc kubenswrapper[4695]: I0320 11:11:09.628471 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vmktl"]
Mar 20 11:11:09 crc kubenswrapper[4695]: I0320 11:11:09.749858 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrtgb\" (UniqueName: \"kubernetes.io/projected/b51b4f14-2f36-4cbe-a7cc-6d34c44a580b-kube-api-access-nrtgb\") pod \"redhat-marketplace-vmktl\" (UID: \"b51b4f14-2f36-4cbe-a7cc-6d34c44a580b\") " pod="openshift-marketplace/redhat-marketplace-vmktl"
Mar 20 11:11:09 crc kubenswrapper[4695]: I0320 11:11:09.750004 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b51b4f14-2f36-4cbe-a7cc-6d34c44a580b-utilities\") pod \"redhat-marketplace-vmktl\" (UID: \"b51b4f14-2f36-4cbe-a7cc-6d34c44a580b\") " pod="openshift-marketplace/redhat-marketplace-vmktl"
Mar 20 11:11:09 crc kubenswrapper[4695]: I0320 11:11:09.750046 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b51b4f14-2f36-4cbe-a7cc-6d34c44a580b-catalog-content\") pod \"redhat-marketplace-vmktl\" (UID: \"b51b4f14-2f36-4cbe-a7cc-6d34c44a580b\") " pod="openshift-marketplace/redhat-marketplace-vmktl"
Mar 20 11:11:09 crc kubenswrapper[4695]: I0320 11:11:09.851240 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b51b4f14-2f36-4cbe-a7cc-6d34c44a580b-catalog-content\") pod \"redhat-marketplace-vmktl\" (UID: \"b51b4f14-2f36-4cbe-a7cc-6d34c44a580b\") " pod="openshift-marketplace/redhat-marketplace-vmktl"
Mar 20 11:11:09 crc kubenswrapper[4695]: I0320 11:11:09.851362 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrtgb\" (UniqueName: \"kubernetes.io/projected/b51b4f14-2f36-4cbe-a7cc-6d34c44a580b-kube-api-access-nrtgb\") pod \"redhat-marketplace-vmktl\" (UID: \"b51b4f14-2f36-4cbe-a7cc-6d34c44a580b\") " pod="openshift-marketplace/redhat-marketplace-vmktl"
Mar 20 11:11:09 crc kubenswrapper[4695]: I0320 11:11:09.851396 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b51b4f14-2f36-4cbe-a7cc-6d34c44a580b-utilities\") pod \"redhat-marketplace-vmktl\" (UID: \"b51b4f14-2f36-4cbe-a7cc-6d34c44a580b\") " pod="openshift-marketplace/redhat-marketplace-vmktl"
Mar 20 11:11:09 crc kubenswrapper[4695]: I0320 11:11:09.851963 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b51b4f14-2f36-4cbe-a7cc-6d34c44a580b-utilities\") pod \"redhat-marketplace-vmktl\" (UID: \"b51b4f14-2f36-4cbe-a7cc-6d34c44a580b\") " pod="openshift-marketplace/redhat-marketplace-vmktl"
Mar 20 11:11:09 crc kubenswrapper[4695]: I0320 11:11:09.851986 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b51b4f14-2f36-4cbe-a7cc-6d34c44a580b-catalog-content\") pod \"redhat-marketplace-vmktl\" (UID: \"b51b4f14-2f36-4cbe-a7cc-6d34c44a580b\") " pod="openshift-marketplace/redhat-marketplace-vmktl"
Mar 20 11:11:09 crc kubenswrapper[4695]: I0320 11:11:09.875559 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrtgb\" (UniqueName: \"kubernetes.io/projected/b51b4f14-2f36-4cbe-a7cc-6d34c44a580b-kube-api-access-nrtgb\") pod \"redhat-marketplace-vmktl\" (UID: \"b51b4f14-2f36-4cbe-a7cc-6d34c44a580b\") " pod="openshift-marketplace/redhat-marketplace-vmktl"
Mar 20 11:11:09 crc kubenswrapper[4695]: I0320 11:11:09.937065 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vmktl"
Mar 20 11:11:10 crc kubenswrapper[4695]: I0320 11:11:10.472309 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vmktl"]
Mar 20 11:11:10 crc kubenswrapper[4695]: W0320 11:11:10.478747 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb51b4f14_2f36_4cbe_a7cc_6d34c44a580b.slice/crio-0790b5800882984c937e403a106b526d8eb68095f0cc0ca246eb05e24fa5d4fa WatchSource:0}: Error finding container 0790b5800882984c937e403a106b526d8eb68095f0cc0ca246eb05e24fa5d4fa: Status 404 returned error can't find the container with id 0790b5800882984c937e403a106b526d8eb68095f0cc0ca246eb05e24fa5d4fa
Mar 20 11:11:11 crc kubenswrapper[4695]: I0320 11:11:11.057884 4695 generic.go:334] "Generic (PLEG): container finished" podID="b51b4f14-2f36-4cbe-a7cc-6d34c44a580b" containerID="9cb04c50f54124a8749303a6a6f51b19974f75cbd2fca5288c5ad679cae2f74a" exitCode=0
Mar 20 11:11:11 crc kubenswrapper[4695]: I0320 11:11:11.057963 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vmktl" event={"ID":"b51b4f14-2f36-4cbe-a7cc-6d34c44a580b","Type":"ContainerDied","Data":"9cb04c50f54124a8749303a6a6f51b19974f75cbd2fca5288c5ad679cae2f74a"}
Mar 20 11:11:11 crc kubenswrapper[4695]: I0320 11:11:11.058435 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vmktl" event={"ID":"b51b4f14-2f36-4cbe-a7cc-6d34c44a580b","Type":"ContainerStarted","Data":"0790b5800882984c937e403a106b526d8eb68095f0cc0ca246eb05e24fa5d4fa"}
Mar 20 11:11:12 crc kubenswrapper[4695]: I0320 11:11:12.030715 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn"]
Mar 20 11:11:12 crc kubenswrapper[4695]: I0320 11:11:12.032452 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn"
Mar 20 11:11:12 crc kubenswrapper[4695]: I0320 11:11:12.037492 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-rnbk8"
Mar 20 11:11:12 crc kubenswrapper[4695]: I0320 11:11:12.039556 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn"]
Mar 20 11:11:12 crc kubenswrapper[4695]: I0320 11:11:12.091864 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/343c2a1f-026f-4fd8-811f-790a957c3c82-util\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn\" (UID: \"343c2a1f-026f-4fd8-811f-790a957c3c82\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn"
Mar 20 11:11:12 crc kubenswrapper[4695]: I0320 11:11:12.092344 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v64wk\" (UniqueName: \"kubernetes.io/projected/343c2a1f-026f-4fd8-811f-790a957c3c82-kube-api-access-v64wk\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn\" (UID: \"343c2a1f-026f-4fd8-811f-790a957c3c82\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn"
Mar 20 11:11:12 crc kubenswrapper[4695]: I0320 11:11:12.092379 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/343c2a1f-026f-4fd8-811f-790a957c3c82-bundle\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn\" (UID: \"343c2a1f-026f-4fd8-811f-790a957c3c82\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn"
Mar 20 11:11:12 crc kubenswrapper[4695]: 
I0320 11:11:12.193802 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v64wk\" (UniqueName: \"kubernetes.io/projected/343c2a1f-026f-4fd8-811f-790a957c3c82-kube-api-access-v64wk\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn\" (UID: \"343c2a1f-026f-4fd8-811f-790a957c3c82\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn" Mar 20 11:11:12 crc kubenswrapper[4695]: I0320 11:11:12.193886 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/343c2a1f-026f-4fd8-811f-790a957c3c82-bundle\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn\" (UID: \"343c2a1f-026f-4fd8-811f-790a957c3c82\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn" Mar 20 11:11:12 crc kubenswrapper[4695]: I0320 11:11:12.193984 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/343c2a1f-026f-4fd8-811f-790a957c3c82-util\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn\" (UID: \"343c2a1f-026f-4fd8-811f-790a957c3c82\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn" Mar 20 11:11:12 crc kubenswrapper[4695]: I0320 11:11:12.194531 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/343c2a1f-026f-4fd8-811f-790a957c3c82-util\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn\" (UID: \"343c2a1f-026f-4fd8-811f-790a957c3c82\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn" Mar 20 11:11:12 crc kubenswrapper[4695]: I0320 11:11:12.194755 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/343c2a1f-026f-4fd8-811f-790a957c3c82-bundle\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn\" (UID: \"343c2a1f-026f-4fd8-811f-790a957c3c82\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn" Mar 20 11:11:12 crc kubenswrapper[4695]: I0320 11:11:12.227875 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v64wk\" (UniqueName: \"kubernetes.io/projected/343c2a1f-026f-4fd8-811f-790a957c3c82-kube-api-access-v64wk\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn\" (UID: \"343c2a1f-026f-4fd8-811f-790a957c3c82\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn" Mar 20 11:11:12 crc kubenswrapper[4695]: I0320 11:11:12.358324 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn" Mar 20 11:11:13 crc kubenswrapper[4695]: I0320 11:11:13.055439 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn"] Mar 20 11:11:13 crc kubenswrapper[4695]: I0320 11:11:13.082318 4695 generic.go:334] "Generic (PLEG): container finished" podID="b51b4f14-2f36-4cbe-a7cc-6d34c44a580b" containerID="0dc768e43c1b74a43c18694ceab6578a85e4940b20cbc062074f8c93202d5eca" exitCode=0 Mar 20 11:11:13 crc kubenswrapper[4695]: I0320 11:11:13.082483 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vmktl" event={"ID":"b51b4f14-2f36-4cbe-a7cc-6d34c44a580b","Type":"ContainerDied","Data":"0dc768e43c1b74a43c18694ceab6578a85e4940b20cbc062074f8c93202d5eca"} Mar 20 11:11:13 crc kubenswrapper[4695]: I0320 11:11:13.085173 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn" 
event={"ID":"343c2a1f-026f-4fd8-811f-790a957c3c82","Type":"ContainerStarted","Data":"2c119bb663a60170ee30f4476d8c8d00275e8a2fcc6fdc81f187bfd5d95abe3e"} Mar 20 11:11:14 crc kubenswrapper[4695]: I0320 11:11:14.097287 4695 generic.go:334] "Generic (PLEG): container finished" podID="343c2a1f-026f-4fd8-811f-790a957c3c82" containerID="e5fbb81ce33ce27159d9921f3820a23b61a0581186a4b1122848cc69d3cf033e" exitCode=0 Mar 20 11:11:14 crc kubenswrapper[4695]: I0320 11:11:14.097406 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn" event={"ID":"343c2a1f-026f-4fd8-811f-790a957c3c82","Type":"ContainerDied","Data":"e5fbb81ce33ce27159d9921f3820a23b61a0581186a4b1122848cc69d3cf033e"} Mar 20 11:11:14 crc kubenswrapper[4695]: I0320 11:11:14.102047 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vmktl" event={"ID":"b51b4f14-2f36-4cbe-a7cc-6d34c44a580b","Type":"ContainerStarted","Data":"618d31bc8f0274c677cd7d2d472a426637107e6492689ed35f015b316c251dea"} Mar 20 11:11:14 crc kubenswrapper[4695]: I0320 11:11:14.144750 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vmktl" podStartSLOduration=2.575222263 podStartE2EDuration="5.144721696s" podCreationTimestamp="2026-03-20 11:11:09 +0000 UTC" firstStartedPulling="2026-03-20 11:11:11.059685518 +0000 UTC m=+1048.840291081" lastFinishedPulling="2026-03-20 11:11:13.629184951 +0000 UTC m=+1051.409790514" observedRunningTime="2026-03-20 11:11:14.142005565 +0000 UTC m=+1051.922611138" watchObservedRunningTime="2026-03-20 11:11:14.144721696 +0000 UTC m=+1051.925327259" Mar 20 11:11:15 crc kubenswrapper[4695]: I0320 11:11:15.109802 4695 generic.go:334] "Generic (PLEG): container finished" podID="343c2a1f-026f-4fd8-811f-790a957c3c82" containerID="5d1a8027e0fa782de49bd140e9eee0571051fcdc571c433b55d9a71eda1bce77" exitCode=0 Mar 20 11:11:15 crc 
kubenswrapper[4695]: I0320 11:11:15.109991 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn" event={"ID":"343c2a1f-026f-4fd8-811f-790a957c3c82","Type":"ContainerDied","Data":"5d1a8027e0fa782de49bd140e9eee0571051fcdc571c433b55d9a71eda1bce77"} Mar 20 11:11:16 crc kubenswrapper[4695]: I0320 11:11:16.121648 4695 generic.go:334] "Generic (PLEG): container finished" podID="343c2a1f-026f-4fd8-811f-790a957c3c82" containerID="120b69e971a2f7cd8e8dae60406aae605dd1645fb4101e9ed4168acbd35c39e6" exitCode=0 Mar 20 11:11:16 crc kubenswrapper[4695]: I0320 11:11:16.121730 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn" event={"ID":"343c2a1f-026f-4fd8-811f-790a957c3c82","Type":"ContainerDied","Data":"120b69e971a2f7cd8e8dae60406aae605dd1645fb4101e9ed4168acbd35c39e6"} Mar 20 11:11:17 crc kubenswrapper[4695]: I0320 11:11:17.403435 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn" Mar 20 11:11:17 crc kubenswrapper[4695]: I0320 11:11:17.585486 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v64wk\" (UniqueName: \"kubernetes.io/projected/343c2a1f-026f-4fd8-811f-790a957c3c82-kube-api-access-v64wk\") pod \"343c2a1f-026f-4fd8-811f-790a957c3c82\" (UID: \"343c2a1f-026f-4fd8-811f-790a957c3c82\") " Mar 20 11:11:17 crc kubenswrapper[4695]: I0320 11:11:17.585673 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/343c2a1f-026f-4fd8-811f-790a957c3c82-util\") pod \"343c2a1f-026f-4fd8-811f-790a957c3c82\" (UID: \"343c2a1f-026f-4fd8-811f-790a957c3c82\") " Mar 20 11:11:17 crc kubenswrapper[4695]: I0320 11:11:17.585735 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/343c2a1f-026f-4fd8-811f-790a957c3c82-bundle\") pod \"343c2a1f-026f-4fd8-811f-790a957c3c82\" (UID: \"343c2a1f-026f-4fd8-811f-790a957c3c82\") " Mar 20 11:11:17 crc kubenswrapper[4695]: I0320 11:11:17.586655 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/343c2a1f-026f-4fd8-811f-790a957c3c82-bundle" (OuterVolumeSpecName: "bundle") pod "343c2a1f-026f-4fd8-811f-790a957c3c82" (UID: "343c2a1f-026f-4fd8-811f-790a957c3c82"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:11:17 crc kubenswrapper[4695]: I0320 11:11:17.592269 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/343c2a1f-026f-4fd8-811f-790a957c3c82-kube-api-access-v64wk" (OuterVolumeSpecName: "kube-api-access-v64wk") pod "343c2a1f-026f-4fd8-811f-790a957c3c82" (UID: "343c2a1f-026f-4fd8-811f-790a957c3c82"). InnerVolumeSpecName "kube-api-access-v64wk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:11:17 crc kubenswrapper[4695]: I0320 11:11:17.604429 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/343c2a1f-026f-4fd8-811f-790a957c3c82-util" (OuterVolumeSpecName: "util") pod "343c2a1f-026f-4fd8-811f-790a957c3c82" (UID: "343c2a1f-026f-4fd8-811f-790a957c3c82"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:11:17 crc kubenswrapper[4695]: I0320 11:11:17.687066 4695 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/343c2a1f-026f-4fd8-811f-790a957c3c82-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:17 crc kubenswrapper[4695]: I0320 11:11:17.687125 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v64wk\" (UniqueName: \"kubernetes.io/projected/343c2a1f-026f-4fd8-811f-790a957c3c82-kube-api-access-v64wk\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:17 crc kubenswrapper[4695]: I0320 11:11:17.687137 4695 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/343c2a1f-026f-4fd8-811f-790a957c3c82-util\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:18 crc kubenswrapper[4695]: I0320 11:11:18.138339 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn" event={"ID":"343c2a1f-026f-4fd8-811f-790a957c3c82","Type":"ContainerDied","Data":"2c119bb663a60170ee30f4476d8c8d00275e8a2fcc6fdc81f187bfd5d95abe3e"} Mar 20 11:11:18 crc kubenswrapper[4695]: I0320 11:11:18.138399 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c119bb663a60170ee30f4476d8c8d00275e8a2fcc6fdc81f187bfd5d95abe3e" Mar 20 11:11:18 crc kubenswrapper[4695]: I0320 11:11:18.138470 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn" Mar 20 11:11:19 crc kubenswrapper[4695]: I0320 11:11:19.937193 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vmktl" Mar 20 11:11:19 crc kubenswrapper[4695]: I0320 11:11:19.937727 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vmktl" Mar 20 11:11:19 crc kubenswrapper[4695]: I0320 11:11:19.983273 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vmktl" Mar 20 11:11:20 crc kubenswrapper[4695]: I0320 11:11:20.213177 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vmktl" Mar 20 11:11:21 crc kubenswrapper[4695]: I0320 11:11:21.474372 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-846ffbb776-b2zpq"] Mar 20 11:11:21 crc kubenswrapper[4695]: E0320 11:11:21.475176 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="343c2a1f-026f-4fd8-811f-790a957c3c82" containerName="util" Mar 20 11:11:21 crc kubenswrapper[4695]: I0320 11:11:21.475199 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="343c2a1f-026f-4fd8-811f-790a957c3c82" containerName="util" Mar 20 11:11:21 crc kubenswrapper[4695]: E0320 11:11:21.475214 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="343c2a1f-026f-4fd8-811f-790a957c3c82" containerName="pull" Mar 20 11:11:21 crc kubenswrapper[4695]: I0320 11:11:21.475220 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="343c2a1f-026f-4fd8-811f-790a957c3c82" containerName="pull" Mar 20 11:11:21 crc kubenswrapper[4695]: E0320 11:11:21.475233 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="343c2a1f-026f-4fd8-811f-790a957c3c82" 
containerName="extract" Mar 20 11:11:21 crc kubenswrapper[4695]: I0320 11:11:21.475239 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="343c2a1f-026f-4fd8-811f-790a957c3c82" containerName="extract" Mar 20 11:11:21 crc kubenswrapper[4695]: I0320 11:11:21.475368 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="343c2a1f-026f-4fd8-811f-790a957c3c82" containerName="extract" Mar 20 11:11:21 crc kubenswrapper[4695]: I0320 11:11:21.476037 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-846ffbb776-b2zpq" Mar 20 11:11:21 crc kubenswrapper[4695]: I0320 11:11:21.478442 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-qzsjs" Mar 20 11:11:21 crc kubenswrapper[4695]: I0320 11:11:21.500405 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-846ffbb776-b2zpq"] Mar 20 11:11:21 crc kubenswrapper[4695]: I0320 11:11:21.650013 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5kzd\" (UniqueName: \"kubernetes.io/projected/b8023be3-6ce2-4167-bda0-378d061db8ac-kube-api-access-r5kzd\") pod \"openstack-operator-controller-init-846ffbb776-b2zpq\" (UID: \"b8023be3-6ce2-4167-bda0-378d061db8ac\") " pod="openstack-operators/openstack-operator-controller-init-846ffbb776-b2zpq" Mar 20 11:11:21 crc kubenswrapper[4695]: I0320 11:11:21.751333 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5kzd\" (UniqueName: \"kubernetes.io/projected/b8023be3-6ce2-4167-bda0-378d061db8ac-kube-api-access-r5kzd\") pod \"openstack-operator-controller-init-846ffbb776-b2zpq\" (UID: \"b8023be3-6ce2-4167-bda0-378d061db8ac\") " pod="openstack-operators/openstack-operator-controller-init-846ffbb776-b2zpq" Mar 20 11:11:21 crc kubenswrapper[4695]: 
I0320 11:11:21.773938 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5kzd\" (UniqueName: \"kubernetes.io/projected/b8023be3-6ce2-4167-bda0-378d061db8ac-kube-api-access-r5kzd\") pod \"openstack-operator-controller-init-846ffbb776-b2zpq\" (UID: \"b8023be3-6ce2-4167-bda0-378d061db8ac\") " pod="openstack-operators/openstack-operator-controller-init-846ffbb776-b2zpq" Mar 20 11:11:21 crc kubenswrapper[4695]: I0320 11:11:21.803160 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-846ffbb776-b2zpq" Mar 20 11:11:22 crc kubenswrapper[4695]: I0320 11:11:22.225690 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-846ffbb776-b2zpq"] Mar 20 11:11:22 crc kubenswrapper[4695]: I0320 11:11:22.375836 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vmktl"] Mar 20 11:11:22 crc kubenswrapper[4695]: I0320 11:11:22.376679 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vmktl" podUID="b51b4f14-2f36-4cbe-a7cc-6d34c44a580b" containerName="registry-server" containerID="cri-o://618d31bc8f0274c677cd7d2d472a426637107e6492689ed35f015b316c251dea" gracePeriod=2 Mar 20 11:11:22 crc kubenswrapper[4695]: I0320 11:11:22.891644 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vmktl" Mar 20 11:11:22 crc kubenswrapper[4695]: I0320 11:11:22.977322 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b51b4f14-2f36-4cbe-a7cc-6d34c44a580b-catalog-content\") pod \"b51b4f14-2f36-4cbe-a7cc-6d34c44a580b\" (UID: \"b51b4f14-2f36-4cbe-a7cc-6d34c44a580b\") " Mar 20 11:11:22 crc kubenswrapper[4695]: I0320 11:11:22.977466 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrtgb\" (UniqueName: \"kubernetes.io/projected/b51b4f14-2f36-4cbe-a7cc-6d34c44a580b-kube-api-access-nrtgb\") pod \"b51b4f14-2f36-4cbe-a7cc-6d34c44a580b\" (UID: \"b51b4f14-2f36-4cbe-a7cc-6d34c44a580b\") " Mar 20 11:11:22 crc kubenswrapper[4695]: I0320 11:11:22.978538 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b51b4f14-2f36-4cbe-a7cc-6d34c44a580b-utilities\") pod \"b51b4f14-2f36-4cbe-a7cc-6d34c44a580b\" (UID: \"b51b4f14-2f36-4cbe-a7cc-6d34c44a580b\") " Mar 20 11:11:22 crc kubenswrapper[4695]: I0320 11:11:22.979499 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b51b4f14-2f36-4cbe-a7cc-6d34c44a580b-utilities" (OuterVolumeSpecName: "utilities") pod "b51b4f14-2f36-4cbe-a7cc-6d34c44a580b" (UID: "b51b4f14-2f36-4cbe-a7cc-6d34c44a580b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:11:22 crc kubenswrapper[4695]: I0320 11:11:22.996246 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b51b4f14-2f36-4cbe-a7cc-6d34c44a580b-kube-api-access-nrtgb" (OuterVolumeSpecName: "kube-api-access-nrtgb") pod "b51b4f14-2f36-4cbe-a7cc-6d34c44a580b" (UID: "b51b4f14-2f36-4cbe-a7cc-6d34c44a580b"). InnerVolumeSpecName "kube-api-access-nrtgb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:11:23 crc kubenswrapper[4695]: I0320 11:11:23.011518 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b51b4f14-2f36-4cbe-a7cc-6d34c44a580b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b51b4f14-2f36-4cbe-a7cc-6d34c44a580b" (UID: "b51b4f14-2f36-4cbe-a7cc-6d34c44a580b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:11:23 crc kubenswrapper[4695]: I0320 11:11:23.080869 4695 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b51b4f14-2f36-4cbe-a7cc-6d34c44a580b-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:23 crc kubenswrapper[4695]: I0320 11:11:23.080931 4695 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b51b4f14-2f36-4cbe-a7cc-6d34c44a580b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:23 crc kubenswrapper[4695]: I0320 11:11:23.080949 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrtgb\" (UniqueName: \"kubernetes.io/projected/b51b4f14-2f36-4cbe-a7cc-6d34c44a580b-kube-api-access-nrtgb\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:23 crc kubenswrapper[4695]: I0320 11:11:23.199951 4695 generic.go:334] "Generic (PLEG): container finished" podID="b51b4f14-2f36-4cbe-a7cc-6d34c44a580b" containerID="618d31bc8f0274c677cd7d2d472a426637107e6492689ed35f015b316c251dea" exitCode=0 Mar 20 11:11:23 crc kubenswrapper[4695]: I0320 11:11:23.200044 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vmktl" Mar 20 11:11:23 crc kubenswrapper[4695]: I0320 11:11:23.200031 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vmktl" event={"ID":"b51b4f14-2f36-4cbe-a7cc-6d34c44a580b","Type":"ContainerDied","Data":"618d31bc8f0274c677cd7d2d472a426637107e6492689ed35f015b316c251dea"} Mar 20 11:11:23 crc kubenswrapper[4695]: I0320 11:11:23.200123 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vmktl" event={"ID":"b51b4f14-2f36-4cbe-a7cc-6d34c44a580b","Type":"ContainerDied","Data":"0790b5800882984c937e403a106b526d8eb68095f0cc0ca246eb05e24fa5d4fa"} Mar 20 11:11:23 crc kubenswrapper[4695]: I0320 11:11:23.200153 4695 scope.go:117] "RemoveContainer" containerID="618d31bc8f0274c677cd7d2d472a426637107e6492689ed35f015b316c251dea" Mar 20 11:11:23 crc kubenswrapper[4695]: I0320 11:11:23.201752 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-846ffbb776-b2zpq" event={"ID":"b8023be3-6ce2-4167-bda0-378d061db8ac","Type":"ContainerStarted","Data":"c7bae8d2270a3ac00813a648a10bf0ceff929bd109b50ccb03f1597015afa877"} Mar 20 11:11:23 crc kubenswrapper[4695]: I0320 11:11:23.239725 4695 scope.go:117] "RemoveContainer" containerID="0dc768e43c1b74a43c18694ceab6578a85e4940b20cbc062074f8c93202d5eca" Mar 20 11:11:23 crc kubenswrapper[4695]: I0320 11:11:23.244704 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vmktl"] Mar 20 11:11:23 crc kubenswrapper[4695]: I0320 11:11:23.249793 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vmktl"] Mar 20 11:11:23 crc kubenswrapper[4695]: I0320 11:11:23.258412 4695 scope.go:117] "RemoveContainer" containerID="9cb04c50f54124a8749303a6a6f51b19974f75cbd2fca5288c5ad679cae2f74a" Mar 20 11:11:23 crc kubenswrapper[4695]: I0320 
11:11:23.287652 4695 scope.go:117] "RemoveContainer" containerID="618d31bc8f0274c677cd7d2d472a426637107e6492689ed35f015b316c251dea" Mar 20 11:11:23 crc kubenswrapper[4695]: E0320 11:11:23.288496 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"618d31bc8f0274c677cd7d2d472a426637107e6492689ed35f015b316c251dea\": container with ID starting with 618d31bc8f0274c677cd7d2d472a426637107e6492689ed35f015b316c251dea not found: ID does not exist" containerID="618d31bc8f0274c677cd7d2d472a426637107e6492689ed35f015b316c251dea" Mar 20 11:11:23 crc kubenswrapper[4695]: I0320 11:11:23.288605 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"618d31bc8f0274c677cd7d2d472a426637107e6492689ed35f015b316c251dea"} err="failed to get container status \"618d31bc8f0274c677cd7d2d472a426637107e6492689ed35f015b316c251dea\": rpc error: code = NotFound desc = could not find container \"618d31bc8f0274c677cd7d2d472a426637107e6492689ed35f015b316c251dea\": container with ID starting with 618d31bc8f0274c677cd7d2d472a426637107e6492689ed35f015b316c251dea not found: ID does not exist" Mar 20 11:11:23 crc kubenswrapper[4695]: I0320 11:11:23.288645 4695 scope.go:117] "RemoveContainer" containerID="0dc768e43c1b74a43c18694ceab6578a85e4940b20cbc062074f8c93202d5eca" Mar 20 11:11:23 crc kubenswrapper[4695]: E0320 11:11:23.290825 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dc768e43c1b74a43c18694ceab6578a85e4940b20cbc062074f8c93202d5eca\": container with ID starting with 0dc768e43c1b74a43c18694ceab6578a85e4940b20cbc062074f8c93202d5eca not found: ID does not exist" containerID="0dc768e43c1b74a43c18694ceab6578a85e4940b20cbc062074f8c93202d5eca" Mar 20 11:11:23 crc kubenswrapper[4695]: I0320 11:11:23.290871 4695 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0dc768e43c1b74a43c18694ceab6578a85e4940b20cbc062074f8c93202d5eca"} err="failed to get container status \"0dc768e43c1b74a43c18694ceab6578a85e4940b20cbc062074f8c93202d5eca\": rpc error: code = NotFound desc = could not find container \"0dc768e43c1b74a43c18694ceab6578a85e4940b20cbc062074f8c93202d5eca\": container with ID starting with 0dc768e43c1b74a43c18694ceab6578a85e4940b20cbc062074f8c93202d5eca not found: ID does not exist"
Mar 20 11:11:23 crc kubenswrapper[4695]: I0320 11:11:23.290902 4695 scope.go:117] "RemoveContainer" containerID="9cb04c50f54124a8749303a6a6f51b19974f75cbd2fca5288c5ad679cae2f74a"
Mar 20 11:11:23 crc kubenswrapper[4695]: E0320 11:11:23.291445 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cb04c50f54124a8749303a6a6f51b19974f75cbd2fca5288c5ad679cae2f74a\": container with ID starting with 9cb04c50f54124a8749303a6a6f51b19974f75cbd2fca5288c5ad679cae2f74a not found: ID does not exist" containerID="9cb04c50f54124a8749303a6a6f51b19974f75cbd2fca5288c5ad679cae2f74a"
Mar 20 11:11:23 crc kubenswrapper[4695]: I0320 11:11:23.291482 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cb04c50f54124a8749303a6a6f51b19974f75cbd2fca5288c5ad679cae2f74a"} err="failed to get container status \"9cb04c50f54124a8749303a6a6f51b19974f75cbd2fca5288c5ad679cae2f74a\": rpc error: code = NotFound desc = could not find container \"9cb04c50f54124a8749303a6a6f51b19974f75cbd2fca5288c5ad679cae2f74a\": container with ID starting with 9cb04c50f54124a8749303a6a6f51b19974f75cbd2fca5288c5ad679cae2f74a not found: ID does not exist"
Mar 20 11:11:24 crc kubenswrapper[4695]: I0320 11:11:24.920391 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b51b4f14-2f36-4cbe-a7cc-6d34c44a580b" path="/var/lib/kubelet/pods/b51b4f14-2f36-4cbe-a7cc-6d34c44a580b/volumes"
Mar 20 11:11:28 crc kubenswrapper[4695]: I0320 11:11:28.250337 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-846ffbb776-b2zpq" event={"ID":"b8023be3-6ce2-4167-bda0-378d061db8ac","Type":"ContainerStarted","Data":"b723b021395864b29e07beb583d7873d46a0f02e4a4680500a009d378ffab240"}
Mar 20 11:11:28 crc kubenswrapper[4695]: I0320 11:11:28.250980 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-846ffbb776-b2zpq"
Mar 20 11:11:28 crc kubenswrapper[4695]: I0320 11:11:28.283652 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-846ffbb776-b2zpq" podStartSLOduration=1.6250332250000001 podStartE2EDuration="7.283626993s" podCreationTimestamp="2026-03-20 11:11:21 +0000 UTC" firstStartedPulling="2026-03-20 11:11:22.221177501 +0000 UTC m=+1060.001783064" lastFinishedPulling="2026-03-20 11:11:27.879771269 +0000 UTC m=+1065.660376832" observedRunningTime="2026-03-20 11:11:28.279526995 +0000 UTC m=+1066.060132568" watchObservedRunningTime="2026-03-20 11:11:28.283626993 +0000 UTC m=+1066.064232556"
Mar 20 11:11:41 crc kubenswrapper[4695]: I0320 11:11:41.826620 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-846ffbb776-b2zpq"
Mar 20 11:12:00 crc kubenswrapper[4695]: I0320 11:12:00.139616 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566752-ws9vl"]
Mar 20 11:12:00 crc kubenswrapper[4695]: E0320 11:12:00.140556 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b51b4f14-2f36-4cbe-a7cc-6d34c44a580b" containerName="extract-utilities"
Mar 20 11:12:00 crc kubenswrapper[4695]: I0320 11:12:00.140571 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="b51b4f14-2f36-4cbe-a7cc-6d34c44a580b" containerName="extract-utilities"
Mar 20 11:12:00 crc kubenswrapper[4695]: E0320 11:12:00.140584 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b51b4f14-2f36-4cbe-a7cc-6d34c44a580b" containerName="extract-content"
Mar 20 11:12:00 crc kubenswrapper[4695]: I0320 11:12:00.140591 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="b51b4f14-2f36-4cbe-a7cc-6d34c44a580b" containerName="extract-content"
Mar 20 11:12:00 crc kubenswrapper[4695]: E0320 11:12:00.140603 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b51b4f14-2f36-4cbe-a7cc-6d34c44a580b" containerName="registry-server"
Mar 20 11:12:00 crc kubenswrapper[4695]: I0320 11:12:00.140611 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="b51b4f14-2f36-4cbe-a7cc-6d34c44a580b" containerName="registry-server"
Mar 20 11:12:00 crc kubenswrapper[4695]: I0320 11:12:00.140712 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="b51b4f14-2f36-4cbe-a7cc-6d34c44a580b" containerName="registry-server"
Mar 20 11:12:00 crc kubenswrapper[4695]: I0320 11:12:00.141215 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566752-ws9vl"
Mar 20 11:12:00 crc kubenswrapper[4695]: I0320 11:12:00.144028 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 11:12:00 crc kubenswrapper[4695]: I0320 11:12:00.145474 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 11:12:00 crc kubenswrapper[4695]: I0320 11:12:00.146557 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5kqds"
Mar 20 11:12:00 crc kubenswrapper[4695]: I0320 11:12:00.167397 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566752-ws9vl"]
Mar 20 11:12:00 crc kubenswrapper[4695]: I0320 11:12:00.217434 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qhvj\" (UniqueName: \"kubernetes.io/projected/df3aebe2-0698-4648-ac7c-eae261c6f8c1-kube-api-access-2qhvj\") pod \"auto-csr-approver-29566752-ws9vl\" (UID: \"df3aebe2-0698-4648-ac7c-eae261c6f8c1\") " pod="openshift-infra/auto-csr-approver-29566752-ws9vl"
Mar 20 11:12:00 crc kubenswrapper[4695]: I0320 11:12:00.318932 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qhvj\" (UniqueName: \"kubernetes.io/projected/df3aebe2-0698-4648-ac7c-eae261c6f8c1-kube-api-access-2qhvj\") pod \"auto-csr-approver-29566752-ws9vl\" (UID: \"df3aebe2-0698-4648-ac7c-eae261c6f8c1\") " pod="openshift-infra/auto-csr-approver-29566752-ws9vl"
Mar 20 11:12:00 crc kubenswrapper[4695]: I0320 11:12:00.345209 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qhvj\" (UniqueName: \"kubernetes.io/projected/df3aebe2-0698-4648-ac7c-eae261c6f8c1-kube-api-access-2qhvj\") pod \"auto-csr-approver-29566752-ws9vl\" (UID: \"df3aebe2-0698-4648-ac7c-eae261c6f8c1\") " pod="openshift-infra/auto-csr-approver-29566752-ws9vl"
Mar 20 11:12:00 crc kubenswrapper[4695]: I0320 11:12:00.462760 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566752-ws9vl"
Mar 20 11:12:00 crc kubenswrapper[4695]: I0320 11:12:00.777377 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566752-ws9vl"]
Mar 20 11:12:01 crc kubenswrapper[4695]: I0320 11:12:01.666821 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566752-ws9vl" event={"ID":"df3aebe2-0698-4648-ac7c-eae261c6f8c1","Type":"ContainerStarted","Data":"50c4e3f1f03768a28d1936b8ecf058b9574f502bef7cbad5f9bfac8e5c903e66"}
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.315480 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-g8s68"]
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.317527 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-g8s68"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.320161 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-98446"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.325218 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-l29bw"]
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.329998 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-l29bw"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.332193 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-g8s68"]
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.333472 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-22pt6"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.347118 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-5t427"]
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.348435 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5t427"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.351715 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-l29bw"]
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.393346 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-gwpz2"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.416850 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-5t427"]
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.431034 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-cg8tn"]
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.432552 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cg8tn"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.439805 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-cqxn4"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.447229 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-sjz6g"]
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.448480 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-sjz6g"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.451651 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-nhxpn"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.454066 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dflc4\" (UniqueName: \"kubernetes.io/projected/c4f4cf94-cced-45aa-9d30-2a60e6a9e291-kube-api-access-dflc4\") pod \"barbican-operator-controller-manager-59bc569d95-l29bw\" (UID: \"c4f4cf94-cced-45aa-9d30-2a60e6a9e291\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-l29bw"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.454168 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtk95\" (UniqueName: \"kubernetes.io/projected/a6a059b0-61e0-4592-8661-e480f9573c66-kube-api-access-jtk95\") pod \"cinder-operator-controller-manager-8d58dc466-g8s68\" (UID: \"a6a059b0-61e0-4592-8661-e480f9573c66\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-g8s68"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.454203 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx6cp\" (UniqueName: \"kubernetes.io/projected/58250fd6-7e5e-429d-907a-c0f2725f029f-kube-api-access-xx6cp\") pod \"designate-operator-controller-manager-588d4d986b-5t427\" (UID: \"58250fd6-7e5e-429d-907a-c0f2725f029f\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5t427"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.459248 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-cg8tn"]
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.480882 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-sjz6g"]
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.495954 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-2l9vk"]
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.497094 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-2l9vk"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.501980 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-rcs7b"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.505342 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-669fff9c7c-54mmr"]
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.506424 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-54mmr"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.511977 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-9rpxp"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.512746 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.520834 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-2l9vk"]
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.528946 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-669fff9c7c-54mmr"]
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.536738 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-ws2ch"]
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.537714 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-ws2ch"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.546833 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-cpq4z"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.555366 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbfwn\" (UniqueName: \"kubernetes.io/projected/d46fd923-64a9-48cf-b3ea-05d6a676d7e1-kube-api-access-vbfwn\") pod \"glance-operator-controller-manager-79df6bcc97-cg8tn\" (UID: \"d46fd923-64a9-48cf-b3ea-05d6a676d7e1\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cg8tn"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.555475 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftrm6\" (UniqueName: \"kubernetes.io/projected/99970094-eb53-4489-ba1d-1f650470c848-kube-api-access-ftrm6\") pod \"heat-operator-controller-manager-67dd5f86f5-sjz6g\" (UID: \"99970094-eb53-4489-ba1d-1f650470c848\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-sjz6g"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.555507 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd90b802-5cbc-4d48-a76a-2903fab33ef0-cert\") pod \"infra-operator-controller-manager-669fff9c7c-54mmr\" (UID: \"fd90b802-5cbc-4d48-a76a-2903fab33ef0\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-54mmr"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.555542 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dflc4\" (UniqueName: \"kubernetes.io/projected/c4f4cf94-cced-45aa-9d30-2a60e6a9e291-kube-api-access-dflc4\") pod \"barbican-operator-controller-manager-59bc569d95-l29bw\" (UID: \"c4f4cf94-cced-45aa-9d30-2a60e6a9e291\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-l29bw"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.555587 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l44bw\" (UniqueName: \"kubernetes.io/projected/bef240d4-6041-44e9-8228-f707a5f2f8eb-kube-api-access-l44bw\") pod \"horizon-operator-controller-manager-8464cc45fb-2l9vk\" (UID: \"bef240d4-6041-44e9-8228-f707a5f2f8eb\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-2l9vk"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.555632 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtk95\" (UniqueName: \"kubernetes.io/projected/a6a059b0-61e0-4592-8661-e480f9573c66-kube-api-access-jtk95\") pod \"cinder-operator-controller-manager-8d58dc466-g8s68\" (UID: \"a6a059b0-61e0-4592-8661-e480f9573c66\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-g8s68"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.555661 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx6cp\" (UniqueName: \"kubernetes.io/projected/58250fd6-7e5e-429d-907a-c0f2725f029f-kube-api-access-xx6cp\") pod \"designate-operator-controller-manager-588d4d986b-5t427\" (UID: \"58250fd6-7e5e-429d-907a-c0f2725f029f\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5t427"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.555688 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nntjn\" (UniqueName: \"kubernetes.io/projected/fd90b802-5cbc-4d48-a76a-2903fab33ef0-kube-api-access-nntjn\") pod \"infra-operator-controller-manager-669fff9c7c-54mmr\" (UID: \"fd90b802-5cbc-4d48-a76a-2903fab33ef0\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-54mmr"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.583998 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-ws2ch"]
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.620995 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-tcghf"]
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.622237 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-tcghf"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.629614 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-bh7xf"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.631588 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtk95\" (UniqueName: \"kubernetes.io/projected/a6a059b0-61e0-4592-8661-e480f9573c66-kube-api-access-jtk95\") pod \"cinder-operator-controller-manager-8d58dc466-g8s68\" (UID: \"a6a059b0-61e0-4592-8661-e480f9573c66\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-g8s68"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.633795 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx6cp\" (UniqueName: \"kubernetes.io/projected/58250fd6-7e5e-429d-907a-c0f2725f029f-kube-api-access-xx6cp\") pod \"designate-operator-controller-manager-588d4d986b-5t427\" (UID: \"58250fd6-7e5e-429d-907a-c0f2725f029f\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5t427"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.641308 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-g8s68"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.653297 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dflc4\" (UniqueName: \"kubernetes.io/projected/c4f4cf94-cced-45aa-9d30-2a60e6a9e291-kube-api-access-dflc4\") pod \"barbican-operator-controller-manager-59bc569d95-l29bw\" (UID: \"c4f4cf94-cced-45aa-9d30-2a60e6a9e291\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-l29bw"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.653713 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-tcghf"]
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.658707 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-l29bw"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.663678 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l44bw\" (UniqueName: \"kubernetes.io/projected/bef240d4-6041-44e9-8228-f707a5f2f8eb-kube-api-access-l44bw\") pod \"horizon-operator-controller-manager-8464cc45fb-2l9vk\" (UID: \"bef240d4-6041-44e9-8228-f707a5f2f8eb\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-2l9vk"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.663769 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nntjn\" (UniqueName: \"kubernetes.io/projected/fd90b802-5cbc-4d48-a76a-2903fab33ef0-kube-api-access-nntjn\") pod \"infra-operator-controller-manager-669fff9c7c-54mmr\" (UID: \"fd90b802-5cbc-4d48-a76a-2903fab33ef0\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-54mmr"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.663824 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjt9z\" (UniqueName: \"kubernetes.io/projected/9a8f730f-c9f3-4467-8b90-cfddd028ee71-kube-api-access-sjt9z\") pod \"ironic-operator-controller-manager-6f787dddc9-ws2ch\" (UID: \"9a8f730f-c9f3-4467-8b90-cfddd028ee71\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-ws2ch"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.663858 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbfwn\" (UniqueName: \"kubernetes.io/projected/d46fd923-64a9-48cf-b3ea-05d6a676d7e1-kube-api-access-vbfwn\") pod \"glance-operator-controller-manager-79df6bcc97-cg8tn\" (UID: \"d46fd923-64a9-48cf-b3ea-05d6a676d7e1\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cg8tn"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.663898 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftrm6\" (UniqueName: \"kubernetes.io/projected/99970094-eb53-4489-ba1d-1f650470c848-kube-api-access-ftrm6\") pod \"heat-operator-controller-manager-67dd5f86f5-sjz6g\" (UID: \"99970094-eb53-4489-ba1d-1f650470c848\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-sjz6g"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.663938 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd90b802-5cbc-4d48-a76a-2903fab33ef0-cert\") pod \"infra-operator-controller-manager-669fff9c7c-54mmr\" (UID: \"fd90b802-5cbc-4d48-a76a-2903fab33ef0\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-54mmr"
Mar 20 11:12:02 crc kubenswrapper[4695]: E0320 11:12:02.664092 4695 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 20 11:12:02 crc kubenswrapper[4695]: E0320 11:12:02.664151 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd90b802-5cbc-4d48-a76a-2903fab33ef0-cert podName:fd90b802-5cbc-4d48-a76a-2903fab33ef0 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:03.164129766 +0000 UTC m=+1100.944735329 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fd90b802-5cbc-4d48-a76a-2903fab33ef0-cert") pod "infra-operator-controller-manager-669fff9c7c-54mmr" (UID: "fd90b802-5cbc-4d48-a76a-2903fab33ef0") : secret "infra-operator-webhook-server-cert" not found
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.671785 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-w62g8"]
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.673018 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-w62g8"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.675550 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5t427"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.676736 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-6c4tm"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.713665 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8kxp"]
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.714831 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8kxp"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.715929 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nntjn\" (UniqueName: \"kubernetes.io/projected/fd90b802-5cbc-4d48-a76a-2903fab33ef0-kube-api-access-nntjn\") pod \"infra-operator-controller-manager-669fff9c7c-54mmr\" (UID: \"fd90b802-5cbc-4d48-a76a-2903fab33ef0\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-54mmr"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.717075 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-w62g8"]
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.719037 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l44bw\" (UniqueName: \"kubernetes.io/projected/bef240d4-6041-44e9-8228-f707a5f2f8eb-kube-api-access-l44bw\") pod \"horizon-operator-controller-manager-8464cc45fb-2l9vk\" (UID: \"bef240d4-6041-44e9-8228-f707a5f2f8eb\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-2l9vk"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.724724 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-pdw2k"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.726871 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566752-ws9vl" event={"ID":"df3aebe2-0698-4648-ac7c-eae261c6f8c1","Type":"ContainerStarted","Data":"6d18f6f009b8b5cc1f2e51458fdd68cd5989eebbe5be8f69feb25f4cbb4809dc"}
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.732177 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbfwn\" (UniqueName: \"kubernetes.io/projected/d46fd923-64a9-48cf-b3ea-05d6a676d7e1-kube-api-access-vbfwn\") pod \"glance-operator-controller-manager-79df6bcc97-cg8tn\" (UID: \"d46fd923-64a9-48cf-b3ea-05d6a676d7e1\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cg8tn"
Mar 20 11:12:02 crc kubenswrapper[4695]: I0320 11:12:02.736901 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftrm6\" (UniqueName: \"kubernetes.io/projected/99970094-eb53-4489-ba1d-1f650470c848-kube-api-access-ftrm6\") pod \"heat-operator-controller-manager-67dd5f86f5-sjz6g\" (UID: \"99970094-eb53-4489-ba1d-1f650470c848\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-sjz6g"
Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:02.751996 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-z62m6"]
Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:02.753217 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-z62m6"
Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:02.757513 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-xzz7c"
Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:02.765359 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xddgv\" (UniqueName: \"kubernetes.io/projected/5ce3aca1-15ad-43a5-be8f-b7c5580fcb59-kube-api-access-xddgv\") pod \"manila-operator-controller-manager-55f864c847-w62g8\" (UID: \"5ce3aca1-15ad-43a5-be8f-b7c5580fcb59\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-w62g8"
Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:02.765436 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjt9z\" (UniqueName: \"kubernetes.io/projected/9a8f730f-c9f3-4467-8b90-cfddd028ee71-kube-api-access-sjt9z\") pod \"ironic-operator-controller-manager-6f787dddc9-ws2ch\" (UID: \"9a8f730f-c9f3-4467-8b90-cfddd028ee71\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-ws2ch"
Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:02.765502 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhjgh\" (UniqueName: \"kubernetes.io/projected/39cbf988-66c1-4ac9-9595-3cf263cde0aa-kube-api-access-rhjgh\") pod \"keystone-operator-controller-manager-768b96df4c-tcghf\" (UID: \"39cbf988-66c1-4ac9-9595-3cf263cde0aa\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-tcghf"
Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:02.765564 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfmj8\" (UniqueName: \"kubernetes.io/projected/7e6a711c-3208-459c-9f80-f29d5bbd0177-kube-api-access-bfmj8\") pod \"mariadb-operator-controller-manager-67ccfc9778-m8kxp\" (UID: \"7e6a711c-3208-459c-9f80-f29d5bbd0177\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8kxp"
Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.072654 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-2l9vk"
Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.072976 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhjgh\" (UniqueName: \"kubernetes.io/projected/39cbf988-66c1-4ac9-9595-3cf263cde0aa-kube-api-access-rhjgh\") pod \"keystone-operator-controller-manager-768b96df4c-tcghf\" (UID: \"39cbf988-66c1-4ac9-9595-3cf263cde0aa\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-tcghf"
Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.090883 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfmj8\" (UniqueName: \"kubernetes.io/projected/7e6a711c-3208-459c-9f80-f29d5bbd0177-kube-api-access-bfmj8\") pod \"mariadb-operator-controller-manager-67ccfc9778-m8kxp\" (UID: \"7e6a711c-3208-459c-9f80-f29d5bbd0177\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8kxp"
Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.091181 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cg8tn"
Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.091287 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xddgv\" (UniqueName: \"kubernetes.io/projected/5ce3aca1-15ad-43a5-be8f-b7c5580fcb59-kube-api-access-xddgv\") pod \"manila-operator-controller-manager-55f864c847-w62g8\" (UID: \"5ce3aca1-15ad-43a5-be8f-b7c5580fcb59\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-w62g8"
Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.122785 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-sjz6g"
Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.182040 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhjgh\" (UniqueName: \"kubernetes.io/projected/39cbf988-66c1-4ac9-9595-3cf263cde0aa-kube-api-access-rhjgh\") pod \"keystone-operator-controller-manager-768b96df4c-tcghf\" (UID: \"39cbf988-66c1-4ac9-9595-3cf263cde0aa\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-tcghf"
Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.183984 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjt9z\" (UniqueName: \"kubernetes.io/projected/9a8f730f-c9f3-4467-8b90-cfddd028ee71-kube-api-access-sjt9z\") pod \"ironic-operator-controller-manager-6f787dddc9-ws2ch\" (UID: \"9a8f730f-c9f3-4467-8b90-cfddd028ee71\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-ws2ch"
Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.188419 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfmj8\" (UniqueName: \"kubernetes.io/projected/7e6a711c-3208-459c-9f80-f29d5bbd0177-kube-api-access-bfmj8\") pod \"mariadb-operator-controller-manager-67ccfc9778-m8kxp\" (UID: \"7e6a711c-3208-459c-9f80-f29d5bbd0177\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8kxp"
Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.190007 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xddgv\" (UniqueName: \"kubernetes.io/projected/5ce3aca1-15ad-43a5-be8f-b7c5580fcb59-kube-api-access-xddgv\") pod \"manila-operator-controller-manager-55f864c847-w62g8\" (UID: \"5ce3aca1-15ad-43a5-be8f-b7c5580fcb59\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-w62g8"
Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.191169 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-ws2ch"
Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.194508 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7jct\" (UniqueName: \"kubernetes.io/projected/d2a6f843-5ef0-48d2-9582-4c56551531a9-kube-api-access-n7jct\") pod \"neutron-operator-controller-manager-767865f676-z62m6\" (UID: \"d2a6f843-5ef0-48d2-9582-4c56551531a9\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-z62m6"
Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.194611 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd90b802-5cbc-4d48-a76a-2903fab33ef0-cert\") pod \"infra-operator-controller-manager-669fff9c7c-54mmr\" (UID: \"fd90b802-5cbc-4d48-a76a-2903fab33ef0\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-54mmr"
Mar 20 11:12:03 crc kubenswrapper[4695]: E0320 11:12:03.194798 4695 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 20 11:12:03 crc kubenswrapper[4695]: E0320 11:12:03.194877 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd90b802-5cbc-4d48-a76a-2903fab33ef0-cert podName:fd90b802-5cbc-4d48-a76a-2903fab33ef0 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:04.194848451 +0000 UTC m=+1101.975454014 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fd90b802-5cbc-4d48-a76a-2903fab33ef0-cert") pod "infra-operator-controller-manager-669fff9c7c-54mmr" (UID: "fd90b802-5cbc-4d48-a76a-2903fab33ef0") : secret "infra-operator-webhook-server-cert" not found
Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.246378 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8kxp"]
Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.246436 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-zcgwb"]
Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.247732 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-z62m6"]
Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.247777 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-4q99q"]
Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.248723 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-zcgwb"]
Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.248778 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-4q99q"]
Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.248817 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m4c8w"]
Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.249821 4695 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m4c8w" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.250535 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-zcgwb" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.250826 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4q99q" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.252013 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-lnj2n"] Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.253753 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-lnj2n" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.258436 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m4c8w"] Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.269395 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-zs6kl"] Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.270875 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-zs6kl" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.277254 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-hxjpw"] Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.278521 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hxjpw" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.291388 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-lnj2n"] Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.301157 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm6jh\" (UniqueName: \"kubernetes.io/projected/26fc4733-51dc-4a8d-ba8e-03bd966cac17-kube-api-access-wm6jh\") pod \"octavia-operator-controller-manager-5b9f45d989-4q99q\" (UID: \"26fc4733-51dc-4a8d-ba8e-03bd966cac17\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4q99q" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.301249 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx7sl\" (UniqueName: \"kubernetes.io/projected/16ad72ba-9b7f-47fc-8216-147e439de734-kube-api-access-rx7sl\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-m4c8w\" (UID: \"16ad72ba-9b7f-47fc-8216-147e439de734\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m4c8w" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.301281 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4cqx\" (UniqueName: \"kubernetes.io/projected/9ab1dbad-3ea8-4ed7-9284-d27f1516c26c-kube-api-access-v4cqx\") pod \"nova-operator-controller-manager-5d488d59fb-zcgwb\" (UID: \"9ab1dbad-3ea8-4ed7-9284-d27f1516c26c\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-zcgwb" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.301311 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7jct\" (UniqueName: 
\"kubernetes.io/projected/d2a6f843-5ef0-48d2-9582-4c56551531a9-kube-api-access-n7jct\") pod \"neutron-operator-controller-manager-767865f676-z62m6\" (UID: \"d2a6f843-5ef0-48d2-9582-4c56551531a9\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-z62m6" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.301351 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n7mh\" (UniqueName: \"kubernetes.io/projected/a7f91c18-4219-4040-b474-4d38d377071a-kube-api-access-5n7mh\") pod \"swift-operator-controller-manager-c674c5965-zs6kl\" (UID: \"a7f91c18-4219-4040-b474-4d38d377071a\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-zs6kl" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.301379 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dmbk\" (UniqueName: \"kubernetes.io/projected/4a3d1db2-ad95-4c29-9d6f-6d6d5dfb1a1a-kube-api-access-5dmbk\") pod \"ovn-operator-controller-manager-884679f54-lnj2n\" (UID: \"4a3d1db2-ad95-4c29-9d6f-6d6d5dfb1a1a\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-lnj2n" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.301434 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16ad72ba-9b7f-47fc-8216-147e439de734-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-m4c8w\" (UID: \"16ad72ba-9b7f-47fc-8216-147e439de734\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m4c8w" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.308039 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-zs6kl"] Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.319740 4695 reflector.go:368] Caches populated for *v1.Secret 
from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-bc9ms" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.320159 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.320388 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-wcnzn" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.320643 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-mrbmr" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.320957 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-jvx8d" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.321169 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-ps5bc" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.321254 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-t8bfb" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.324715 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-qwltk"] Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.325894 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-qwltk" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.337623 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-7zfw9" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.339103 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-njjcr"] Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.340615 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-njjcr" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.343886 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7jct\" (UniqueName: \"kubernetes.io/projected/d2a6f843-5ef0-48d2-9582-4c56551531a9-kube-api-access-n7jct\") pod \"neutron-operator-controller-manager-767865f676-z62m6\" (UID: \"d2a6f843-5ef0-48d2-9582-4c56551531a9\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-z62m6" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.346422 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-njjcr"] Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.354497 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-qwltk"] Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.355021 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-pkmff" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.365724 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hvnnc"] Mar 20 11:12:03 crc 
kubenswrapper[4695]: I0320 11:12:03.371106 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hvnnc" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.374540 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-xlmv7" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.377791 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-hxjpw"] Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.381312 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-w62g8" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.403314 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-tcghf" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.419172 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16ad72ba-9b7f-47fc-8216-147e439de734-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-m4c8w\" (UID: \"16ad72ba-9b7f-47fc-8216-147e439de734\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m4c8w" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.420050 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm6jh\" (UniqueName: \"kubernetes.io/projected/26fc4733-51dc-4a8d-ba8e-03bd966cac17-kube-api-access-wm6jh\") pod \"octavia-operator-controller-manager-5b9f45d989-4q99q\" (UID: \"26fc4733-51dc-4a8d-ba8e-03bd966cac17\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4q99q" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 
11:12:03.420251 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zljj\" (UniqueName: \"kubernetes.io/projected/f4807ed5-3bee-42a3-a23b-e7473fc1b833-kube-api-access-8zljj\") pod \"watcher-operator-controller-manager-6c4d75f7f9-hvnnc\" (UID: \"f4807ed5-3bee-42a3-a23b-e7473fc1b833\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hvnnc" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.420311 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx7sl\" (UniqueName: \"kubernetes.io/projected/16ad72ba-9b7f-47fc-8216-147e439de734-kube-api-access-rx7sl\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-m4c8w\" (UID: \"16ad72ba-9b7f-47fc-8216-147e439de734\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m4c8w" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.420345 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4cqx\" (UniqueName: \"kubernetes.io/projected/9ab1dbad-3ea8-4ed7-9284-d27f1516c26c-kube-api-access-v4cqx\") pod \"nova-operator-controller-manager-5d488d59fb-zcgwb\" (UID: \"9ab1dbad-3ea8-4ed7-9284-d27f1516c26c\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-zcgwb" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.420467 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpmnz\" (UniqueName: \"kubernetes.io/projected/0b97d7e0-8c0d-425f-927c-cdf926f3b9fb-kube-api-access-kpmnz\") pod \"placement-operator-controller-manager-5784578c99-hxjpw\" (UID: \"0b97d7e0-8c0d-425f-927c-cdf926f3b9fb\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-hxjpw" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.420507 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5n7mh\" (UniqueName: \"kubernetes.io/projected/a7f91c18-4219-4040-b474-4d38d377071a-kube-api-access-5n7mh\") pod \"swift-operator-controller-manager-c674c5965-zs6kl\" (UID: \"a7f91c18-4219-4040-b474-4d38d377071a\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-zs6kl" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.420570 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnq47\" (UniqueName: \"kubernetes.io/projected/23e52d31-9d42-439d-95a1-1761dee30f57-kube-api-access-vnq47\") pod \"telemetry-operator-controller-manager-d6b694c5-qwltk\" (UID: \"23e52d31-9d42-439d-95a1-1761dee30f57\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-qwltk" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.420599 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dmbk\" (UniqueName: \"kubernetes.io/projected/4a3d1db2-ad95-4c29-9d6f-6d6d5dfb1a1a-kube-api-access-5dmbk\") pod \"ovn-operator-controller-manager-884679f54-lnj2n\" (UID: \"4a3d1db2-ad95-4c29-9d6f-6d6d5dfb1a1a\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-lnj2n" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.420651 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k6g2\" (UniqueName: \"kubernetes.io/projected/498749d1-4031-4083-bb7b-e2640519795e-kube-api-access-6k6g2\") pod \"test-operator-controller-manager-5c5cb9c4d7-njjcr\" (UID: \"498749d1-4031-4083-bb7b-e2640519795e\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-njjcr" Mar 20 11:12:03 crc kubenswrapper[4695]: E0320 11:12:03.419489 4695 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:12:03 crc kubenswrapper[4695]: 
E0320 11:12:03.420866 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16ad72ba-9b7f-47fc-8216-147e439de734-cert podName:16ad72ba-9b7f-47fc-8216-147e439de734 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:03.920828942 +0000 UTC m=+1101.701434505 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/16ad72ba-9b7f-47fc-8216-147e439de734-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-m4c8w" (UID: "16ad72ba-9b7f-47fc-8216-147e439de734") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.430973 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6697dffbc-5w2r2"] Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.443199 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-5w2r2" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.454700 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.455285 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.455841 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-vcbgk" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.465823 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx7sl\" (UniqueName: \"kubernetes.io/projected/16ad72ba-9b7f-47fc-8216-147e439de734-kube-api-access-rx7sl\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-m4c8w\" (UID: \"16ad72ba-9b7f-47fc-8216-147e439de734\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m4c8w" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.473402 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dmbk\" (UniqueName: \"kubernetes.io/projected/4a3d1db2-ad95-4c29-9d6f-6d6d5dfb1a1a-kube-api-access-5dmbk\") pod \"ovn-operator-controller-manager-884679f54-lnj2n\" (UID: \"4a3d1db2-ad95-4c29-9d6f-6d6d5dfb1a1a\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-lnj2n" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.478460 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8kxp" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.480869 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n7mh\" (UniqueName: \"kubernetes.io/projected/a7f91c18-4219-4040-b474-4d38d377071a-kube-api-access-5n7mh\") pod \"swift-operator-controller-manager-c674c5965-zs6kl\" (UID: \"a7f91c18-4219-4040-b474-4d38d377071a\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-zs6kl" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.483021 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm6jh\" (UniqueName: \"kubernetes.io/projected/26fc4733-51dc-4a8d-ba8e-03bd966cac17-kube-api-access-wm6jh\") pod \"octavia-operator-controller-manager-5b9f45d989-4q99q\" (UID: \"26fc4733-51dc-4a8d-ba8e-03bd966cac17\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4q99q" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.487752 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hvnnc"] Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.517444 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-v4cqx\" (UniqueName: \"kubernetes.io/projected/9ab1dbad-3ea8-4ed7-9284-d27f1516c26c-kube-api-access-v4cqx\") pod \"nova-operator-controller-manager-5d488d59fb-zcgwb\" (UID: \"9ab1dbad-3ea8-4ed7-9284-d27f1516c26c\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-zcgwb" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.528192 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-z62m6" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.529430 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpmnz\" (UniqueName: \"kubernetes.io/projected/0b97d7e0-8c0d-425f-927c-cdf926f3b9fb-kube-api-access-kpmnz\") pod \"placement-operator-controller-manager-5784578c99-hxjpw\" (UID: \"0b97d7e0-8c0d-425f-927c-cdf926f3b9fb\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-hxjpw" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.529460 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnq47\" (UniqueName: \"kubernetes.io/projected/23e52d31-9d42-439d-95a1-1761dee30f57-kube-api-access-vnq47\") pod \"telemetry-operator-controller-manager-d6b694c5-qwltk\" (UID: \"23e52d31-9d42-439d-95a1-1761dee30f57\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-qwltk" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.529522 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z69g4\" (UniqueName: \"kubernetes.io/projected/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-kube-api-access-z69g4\") pod \"openstack-operator-controller-manager-6697dffbc-5w2r2\" (UID: \"53c876bd-49e1-4e7c-9673-91ebcd6b19a0\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-5w2r2" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.529547 
4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6k6g2\" (UniqueName: \"kubernetes.io/projected/498749d1-4031-4083-bb7b-e2640519795e-kube-api-access-6k6g2\") pod \"test-operator-controller-manager-5c5cb9c4d7-njjcr\" (UID: \"498749d1-4031-4083-bb7b-e2640519795e\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-njjcr" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.529585 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-5w2r2\" (UID: \"53c876bd-49e1-4e7c-9673-91ebcd6b19a0\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-5w2r2" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.529659 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-5w2r2\" (UID: \"53c876bd-49e1-4e7c-9673-91ebcd6b19a0\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-5w2r2" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.529688 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zljj\" (UniqueName: \"kubernetes.io/projected/f4807ed5-3bee-42a3-a23b-e7473fc1b833-kube-api-access-8zljj\") pod \"watcher-operator-controller-manager-6c4d75f7f9-hvnnc\" (UID: \"f4807ed5-3bee-42a3-a23b-e7473fc1b833\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hvnnc" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.585563 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6697dffbc-5w2r2"] Mar 20 11:12:03 crc 
kubenswrapper[4695]: I0320 11:12:03.634201 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566752-ws9vl" podStartSLOduration=2.554587882 podStartE2EDuration="3.63416309s" podCreationTimestamp="2026-03-20 11:12:00 +0000 UTC" firstStartedPulling="2026-03-20 11:12:00.794434922 +0000 UTC m=+1098.575040485" lastFinishedPulling="2026-03-20 11:12:01.87401013 +0000 UTC m=+1099.654615693" observedRunningTime="2026-03-20 11:12:03.31822897 +0000 UTC m=+1101.098834533" watchObservedRunningTime="2026-03-20 11:12:03.63416309 +0000 UTC m=+1101.414768653" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.651836 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z69g4\" (UniqueName: \"kubernetes.io/projected/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-kube-api-access-z69g4\") pod \"openstack-operator-controller-manager-6697dffbc-5w2r2\" (UID: \"53c876bd-49e1-4e7c-9673-91ebcd6b19a0\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-5w2r2" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.675552 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k6g2\" (UniqueName: \"kubernetes.io/projected/498749d1-4031-4083-bb7b-e2640519795e-kube-api-access-6k6g2\") pod \"test-operator-controller-manager-5c5cb9c4d7-njjcr\" (UID: \"498749d1-4031-4083-bb7b-e2640519795e\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-njjcr" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.685260 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnq47\" (UniqueName: \"kubernetes.io/projected/23e52d31-9d42-439d-95a1-1761dee30f57-kube-api-access-vnq47\") pod \"telemetry-operator-controller-manager-d6b694c5-qwltk\" (UID: \"23e52d31-9d42-439d-95a1-1761dee30f57\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-qwltk" Mar 20 11:12:03 crc 
kubenswrapper[4695]: I0320 11:12:03.685495 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpmnz\" (UniqueName: \"kubernetes.io/projected/0b97d7e0-8c0d-425f-927c-cdf926f3b9fb-kube-api-access-kpmnz\") pod \"placement-operator-controller-manager-5784578c99-hxjpw\" (UID: \"0b97d7e0-8c0d-425f-927c-cdf926f3b9fb\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-hxjpw" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.687029 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-5w2r2\" (UID: \"53c876bd-49e1-4e7c-9673-91ebcd6b19a0\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-5w2r2" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.688387 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-5w2r2\" (UID: \"53c876bd-49e1-4e7c-9673-91ebcd6b19a0\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-5w2r2" Mar 20 11:12:03 crc kubenswrapper[4695]: E0320 11:12:03.688605 4695 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 11:12:03 crc kubenswrapper[4695]: E0320 11:12:03.688689 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-webhook-certs podName:53c876bd-49e1-4e7c-9673-91ebcd6b19a0 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:04.188662585 +0000 UTC m=+1101.969268158 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-webhook-certs") pod "openstack-operator-controller-manager-6697dffbc-5w2r2" (UID: "53c876bd-49e1-4e7c-9673-91ebcd6b19a0") : secret "webhook-server-cert" not found Mar 20 11:12:03 crc kubenswrapper[4695]: E0320 11:12:03.691740 4695 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 11:12:03 crc kubenswrapper[4695]: E0320 11:12:03.691813 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-metrics-certs podName:53c876bd-49e1-4e7c-9673-91ebcd6b19a0 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:04.191789277 +0000 UTC m=+1101.972394840 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-metrics-certs") pod "openstack-operator-controller-manager-6697dffbc-5w2r2" (UID: "53c876bd-49e1-4e7c-9673-91ebcd6b19a0") : secret "metrics-server-cert" not found Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.697811 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zljj\" (UniqueName: \"kubernetes.io/projected/f4807ed5-3bee-42a3-a23b-e7473fc1b833-kube-api-access-8zljj\") pod \"watcher-operator-controller-manager-6c4d75f7f9-hvnnc\" (UID: \"f4807ed5-3bee-42a3-a23b-e7473fc1b833\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hvnnc" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.754866 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z69g4\" (UniqueName: \"kubernetes.io/projected/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-kube-api-access-z69g4\") pod \"openstack-operator-controller-manager-6697dffbc-5w2r2\" (UID: \"53c876bd-49e1-4e7c-9673-91ebcd6b19a0\") " 
pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-5w2r2" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.799755 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-zcgwb" Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.819315 4695 generic.go:334] "Generic (PLEG): container finished" podID="df3aebe2-0698-4648-ac7c-eae261c6f8c1" containerID="6d18f6f009b8b5cc1f2e51458fdd68cd5989eebbe5be8f69feb25f4cbb4809dc" exitCode=0 Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.819576 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566752-ws9vl" event={"ID":"df3aebe2-0698-4648-ac7c-eae261c6f8c1","Type":"ContainerDied","Data":"6d18f6f009b8b5cc1f2e51458fdd68cd5989eebbe5be8f69feb25f4cbb4809dc"} Mar 20 11:12:03 crc kubenswrapper[4695]: I0320 11:12:03.868899 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4q99q" Mar 20 11:12:04 crc kubenswrapper[4695]: I0320 11:12:04.081283 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16ad72ba-9b7f-47fc-8216-147e439de734-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-m4c8w\" (UID: \"16ad72ba-9b7f-47fc-8216-147e439de734\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m4c8w" Mar 20 11:12:04 crc kubenswrapper[4695]: E0320 11:12:04.089646 4695 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:12:04 crc kubenswrapper[4695]: E0320 11:12:04.089779 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16ad72ba-9b7f-47fc-8216-147e439de734-cert 
podName:16ad72ba-9b7f-47fc-8216-147e439de734 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:05.089747007 +0000 UTC m=+1102.870352570 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/16ad72ba-9b7f-47fc-8216-147e439de734-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-m4c8w" (UID: "16ad72ba-9b7f-47fc-8216-147e439de734") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:12:04 crc kubenswrapper[4695]: I0320 11:12:04.146086 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-lnj2n" Mar 20 11:12:04 crc kubenswrapper[4695]: I0320 11:12:04.243332 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-5w2r2\" (UID: \"53c876bd-49e1-4e7c-9673-91ebcd6b19a0\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-5w2r2" Mar 20 11:12:04 crc kubenswrapper[4695]: I0320 11:12:04.245702 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd90b802-5cbc-4d48-a76a-2903fab33ef0-cert\") pod \"infra-operator-controller-manager-669fff9c7c-54mmr\" (UID: \"fd90b802-5cbc-4d48-a76a-2903fab33ef0\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-54mmr" Mar 20 11:12:04 crc kubenswrapper[4695]: I0320 11:12:04.246491 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-5w2r2\" (UID: \"53c876bd-49e1-4e7c-9673-91ebcd6b19a0\") " 
pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-5w2r2" Mar 20 11:12:04 crc kubenswrapper[4695]: E0320 11:12:04.257898 4695 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 11:12:04 crc kubenswrapper[4695]: E0320 11:12:04.258039 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-metrics-certs podName:53c876bd-49e1-4e7c-9673-91ebcd6b19a0 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:05.258009428 +0000 UTC m=+1103.038614991 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-metrics-certs") pod "openstack-operator-controller-manager-6697dffbc-5w2r2" (UID: "53c876bd-49e1-4e7c-9673-91ebcd6b19a0") : secret "metrics-server-cert" not found Mar 20 11:12:04 crc kubenswrapper[4695]: E0320 11:12:04.258291 4695 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 11:12:04 crc kubenswrapper[4695]: E0320 11:12:04.258333 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd90b802-5cbc-4d48-a76a-2903fab33ef0-cert podName:fd90b802-5cbc-4d48-a76a-2903fab33ef0 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:06.258324286 +0000 UTC m=+1104.038929849 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fd90b802-5cbc-4d48-a76a-2903fab33ef0-cert") pod "infra-operator-controller-manager-669fff9c7c-54mmr" (UID: "fd90b802-5cbc-4d48-a76a-2903fab33ef0") : secret "infra-operator-webhook-server-cert" not found Mar 20 11:12:04 crc kubenswrapper[4695]: E0320 11:12:04.262978 4695 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 11:12:04 crc kubenswrapper[4695]: E0320 11:12:04.263065 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-webhook-certs podName:53c876bd-49e1-4e7c-9673-91ebcd6b19a0 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:05.26305066 +0000 UTC m=+1103.043656223 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-webhook-certs") pod "openstack-operator-controller-manager-6697dffbc-5w2r2" (UID: "53c876bd-49e1-4e7c-9673-91ebcd6b19a0") : secret "webhook-server-cert" not found Mar 20 11:12:04 crc kubenswrapper[4695]: I0320 11:12:04.421164 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-zs6kl" Mar 20 11:12:04 crc kubenswrapper[4695]: I0320 11:12:04.439339 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hxjpw" Mar 20 11:12:04 crc kubenswrapper[4695]: I0320 11:12:04.465568 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-qwltk" Mar 20 11:12:04 crc kubenswrapper[4695]: I0320 11:12:04.492847 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-njjcr" Mar 20 11:12:04 crc kubenswrapper[4695]: I0320 11:12:04.508781 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hvnnc" Mar 20 11:12:05 crc kubenswrapper[4695]: I0320 11:12:04.972581 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-g8s68"] Mar 20 11:12:05 crc kubenswrapper[4695]: W0320 11:12:05.079901 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6a059b0_61e0_4592_8661_e480f9573c66.slice/crio-efa4b70db1bab2940cb9a3cf67443939c71131fb58ae2aa89648c9a0cf29d376 WatchSource:0}: Error finding container efa4b70db1bab2940cb9a3cf67443939c71131fb58ae2aa89648c9a0cf29d376: Status 404 returned error can't find the container with id efa4b70db1bab2940cb9a3cf67443939c71131fb58ae2aa89648c9a0cf29d376 Mar 20 11:12:05 crc kubenswrapper[4695]: I0320 11:12:05.219573 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16ad72ba-9b7f-47fc-8216-147e439de734-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-m4c8w\" (UID: \"16ad72ba-9b7f-47fc-8216-147e439de734\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m4c8w" Mar 20 11:12:05 crc kubenswrapper[4695]: E0320 11:12:05.219844 4695 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:12:05 crc kubenswrapper[4695]: E0320 11:12:05.219972 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16ad72ba-9b7f-47fc-8216-147e439de734-cert podName:16ad72ba-9b7f-47fc-8216-147e439de734 nodeName:}" failed. 
No retries permitted until 2026-03-20 11:12:07.219893506 +0000 UTC m=+1105.000499069 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/16ad72ba-9b7f-47fc-8216-147e439de734-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-m4c8w" (UID: "16ad72ba-9b7f-47fc-8216-147e439de734") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:12:05 crc kubenswrapper[4695]: I0320 11:12:05.230070 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-ws2ch"] Mar 20 11:12:05 crc kubenswrapper[4695]: I0320 11:12:05.321137 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-5w2r2\" (UID: \"53c876bd-49e1-4e7c-9673-91ebcd6b19a0\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-5w2r2" Mar 20 11:12:05 crc kubenswrapper[4695]: I0320 11:12:05.321281 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-5w2r2\" (UID: \"53c876bd-49e1-4e7c-9673-91ebcd6b19a0\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-5w2r2" Mar 20 11:12:05 crc kubenswrapper[4695]: E0320 11:12:05.321428 4695 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 11:12:05 crc kubenswrapper[4695]: E0320 11:12:05.321480 4695 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 11:12:05 crc kubenswrapper[4695]: E0320 11:12:05.321538 4695 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-metrics-certs podName:53c876bd-49e1-4e7c-9673-91ebcd6b19a0 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:07.321512632 +0000 UTC m=+1105.102118195 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-metrics-certs") pod "openstack-operator-controller-manager-6697dffbc-5w2r2" (UID: "53c876bd-49e1-4e7c-9673-91ebcd6b19a0") : secret "metrics-server-cert" not found Mar 20 11:12:05 crc kubenswrapper[4695]: E0320 11:12:05.321562 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-webhook-certs podName:53c876bd-49e1-4e7c-9673-91ebcd6b19a0 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:07.321554123 +0000 UTC m=+1105.102159696 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-webhook-certs") pod "openstack-operator-controller-manager-6697dffbc-5w2r2" (UID: "53c876bd-49e1-4e7c-9673-91ebcd6b19a0") : secret "webhook-server-cert" not found Mar 20 11:12:05 crc kubenswrapper[4695]: I0320 11:12:05.323740 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-sjz6g"] Mar 20 11:12:05 crc kubenswrapper[4695]: I0320 11:12:05.784982 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-l29bw"] Mar 20 11:12:05 crc kubenswrapper[4695]: W0320 11:12:05.811275 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4f4cf94_cced_45aa_9d30_2a60e6a9e291.slice/crio-b1f481cb8940fc55d62ab8978abe90b7f0baec6db952557b5ac67031968ba449 WatchSource:0}: Error finding container 
b1f481cb8940fc55d62ab8978abe90b7f0baec6db952557b5ac67031968ba449: Status 404 returned error can't find the container with id b1f481cb8940fc55d62ab8978abe90b7f0baec6db952557b5ac67031968ba449 Mar 20 11:12:05 crc kubenswrapper[4695]: I0320 11:12:05.858441 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-2l9vk"] Mar 20 11:12:05 crc kubenswrapper[4695]: I0320 11:12:05.871144 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-5t427"] Mar 20 11:12:05 crc kubenswrapper[4695]: W0320 11:12:05.885549 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd46fd923_64a9_48cf_b3ea_05d6a676d7e1.slice/crio-82dbdae4d3f5b55acc2bcf7b01a8091a36d31d49ad650df8d3a5fb5eee774527 WatchSource:0}: Error finding container 82dbdae4d3f5b55acc2bcf7b01a8091a36d31d49ad650df8d3a5fb5eee774527: Status 404 returned error can't find the container with id 82dbdae4d3f5b55acc2bcf7b01a8091a36d31d49ad650df8d3a5fb5eee774527 Mar 20 11:12:05 crc kubenswrapper[4695]: I0320 11:12:05.890332 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-w62g8"] Mar 20 11:12:05 crc kubenswrapper[4695]: I0320 11:12:05.928175 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-cg8tn"] Mar 20 11:12:06 crc kubenswrapper[4695]: I0320 11:12:06.000853 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-zcgwb"] Mar 20 11:12:06 crc kubenswrapper[4695]: I0320 11:12:06.004314 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-g8s68" 
event={"ID":"a6a059b0-61e0-4592-8661-e480f9573c66","Type":"ContainerStarted","Data":"efa4b70db1bab2940cb9a3cf67443939c71131fb58ae2aa89648c9a0cf29d376"} Mar 20 11:12:06 crc kubenswrapper[4695]: I0320 11:12:06.006694 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5t427" event={"ID":"58250fd6-7e5e-429d-907a-c0f2725f029f","Type":"ContainerStarted","Data":"f4847fe41542a390d66f2b29eabb337d3c6364a8b183cc6265b8a37efa09bcd0"} Mar 20 11:12:06 crc kubenswrapper[4695]: I0320 11:12:06.008833 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-2l9vk" event={"ID":"bef240d4-6041-44e9-8228-f707a5f2f8eb","Type":"ContainerStarted","Data":"9c024ca7b3aeb7322430c6e92812e64a076cbd3121221463e7375cca2d56261b"} Mar 20 11:12:06 crc kubenswrapper[4695]: I0320 11:12:06.015262 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cg8tn" event={"ID":"d46fd923-64a9-48cf-b3ea-05d6a676d7e1","Type":"ContainerStarted","Data":"82dbdae4d3f5b55acc2bcf7b01a8091a36d31d49ad650df8d3a5fb5eee774527"} Mar 20 11:12:06 crc kubenswrapper[4695]: I0320 11:12:06.020608 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-njjcr"] Mar 20 11:12:06 crc kubenswrapper[4695]: I0320 11:12:06.027712 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-l29bw" event={"ID":"c4f4cf94-cced-45aa-9d30-2a60e6a9e291","Type":"ContainerStarted","Data":"b1f481cb8940fc55d62ab8978abe90b7f0baec6db952557b5ac67031968ba449"} Mar 20 11:12:06 crc kubenswrapper[4695]: I0320 11:12:06.029685 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-sjz6g" 
event={"ID":"99970094-eb53-4489-ba1d-1f650470c848","Type":"ContainerStarted","Data":"2a0fe9638b2ee22fd630d6f92340b03a93e6464903b3490e4c69f639d9df881b"} Mar 20 11:12:06 crc kubenswrapper[4695]: I0320 11:12:06.031411 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-tcghf"] Mar 20 11:12:06 crc kubenswrapper[4695]: I0320 11:12:06.032210 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-ws2ch" event={"ID":"9a8f730f-c9f3-4467-8b90-cfddd028ee71","Type":"ContainerStarted","Data":"4089a6a9d838a43a5f432d56afc160542fe2469ad0efd586cbb62b450287b111"} Mar 20 11:12:06 crc kubenswrapper[4695]: I0320 11:12:06.033233 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-w62g8" event={"ID":"5ce3aca1-15ad-43a5-be8f-b7c5580fcb59","Type":"ContainerStarted","Data":"85a77022e1c4444a7abdc51ed9ab68c05ecf95f72526c0029352985270cbd9cd"} Mar 20 11:12:06 crc kubenswrapper[4695]: I0320 11:12:06.066576 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8kxp"] Mar 20 11:12:06 crc kubenswrapper[4695]: I0320 11:12:06.114338 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566752-ws9vl" Mar 20 11:12:06 crc kubenswrapper[4695]: I0320 11:12:06.163374 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qhvj\" (UniqueName: \"kubernetes.io/projected/df3aebe2-0698-4648-ac7c-eae261c6f8c1-kube-api-access-2qhvj\") pod \"df3aebe2-0698-4648-ac7c-eae261c6f8c1\" (UID: \"df3aebe2-0698-4648-ac7c-eae261c6f8c1\") " Mar 20 11:12:06 crc kubenswrapper[4695]: W0320 11:12:06.165983 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4807ed5_3bee_42a3_a23b_e7473fc1b833.slice/crio-19373102eb4c076cb57123184e2bdf518dedddb8ba4f58c5dbbdbff3f0aaf10f WatchSource:0}: Error finding container 19373102eb4c076cb57123184e2bdf518dedddb8ba4f58c5dbbdbff3f0aaf10f: Status 404 returned error can't find the container with id 19373102eb4c076cb57123184e2bdf518dedddb8ba4f58c5dbbdbff3f0aaf10f Mar 20 11:12:06 crc kubenswrapper[4695]: I0320 11:12:06.173751 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df3aebe2-0698-4648-ac7c-eae261c6f8c1-kube-api-access-2qhvj" (OuterVolumeSpecName: "kube-api-access-2qhvj") pod "df3aebe2-0698-4648-ac7c-eae261c6f8c1" (UID: "df3aebe2-0698-4648-ac7c-eae261c6f8c1"). InnerVolumeSpecName "kube-api-access-2qhvj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:12:06 crc kubenswrapper[4695]: I0320 11:12:06.196582 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hvnnc"] Mar 20 11:12:06 crc kubenswrapper[4695]: I0320 11:12:06.255451 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-qwltk"] Mar 20 11:12:06 crc kubenswrapper[4695]: I0320 11:12:06.268956 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd90b802-5cbc-4d48-a76a-2903fab33ef0-cert\") pod \"infra-operator-controller-manager-669fff9c7c-54mmr\" (UID: \"fd90b802-5cbc-4d48-a76a-2903fab33ef0\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-54mmr" Mar 20 11:12:06 crc kubenswrapper[4695]: I0320 11:12:06.269075 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qhvj\" (UniqueName: \"kubernetes.io/projected/df3aebe2-0698-4648-ac7c-eae261c6f8c1-kube-api-access-2qhvj\") on node \"crc\" DevicePath \"\"" Mar 20 11:12:06 crc kubenswrapper[4695]: E0320 11:12:06.269234 4695 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 11:12:06 crc kubenswrapper[4695]: E0320 11:12:06.269309 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd90b802-5cbc-4d48-a76a-2903fab33ef0-cert podName:fd90b802-5cbc-4d48-a76a-2903fab33ef0 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:10.26927793 +0000 UTC m=+1108.049883493 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fd90b802-5cbc-4d48-a76a-2903fab33ef0-cert") pod "infra-operator-controller-manager-669fff9c7c-54mmr" (UID: "fd90b802-5cbc-4d48-a76a-2903fab33ef0") : secret "infra-operator-webhook-server-cert" not found Mar 20 11:12:06 crc kubenswrapper[4695]: I0320 11:12:06.337543 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-4q99q"] Mar 20 11:12:06 crc kubenswrapper[4695]: I0320 11:12:06.354201 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-z62m6"] Mar 20 11:12:06 crc kubenswrapper[4695]: I0320 11:12:06.365086 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-lnj2n"] Mar 20 11:12:06 crc kubenswrapper[4695]: E0320 11:12:06.387973 4695 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n7jct,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-767865f676-z62m6_openstack-operators(d2a6f843-5ef0-48d2-9582-4c56551531a9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 11:12:06 crc kubenswrapper[4695]: E0320 11:12:06.389309 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-z62m6" podUID="d2a6f843-5ef0-48d2-9582-4c56551531a9" Mar 20 11:12:06 crc 
kubenswrapper[4695]: E0320 11:12:06.393760 4695 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5dmbk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-884679f54-lnj2n_openstack-operators(4a3d1db2-ad95-4c29-9d6f-6d6d5dfb1a1a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 11:12:06 crc kubenswrapper[4695]: E0320 11:12:06.395104 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-lnj2n" podUID="4a3d1db2-ad95-4c29-9d6f-6d6d5dfb1a1a" Mar 20 11:12:06 crc kubenswrapper[4695]: E0320 11:12:06.397326 4695 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kpmnz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5784578c99-hxjpw_openstack-operators(0b97d7e0-8c0d-425f-927c-cdf926f3b9fb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 11:12:06 crc kubenswrapper[4695]: E0320 11:12:06.398415 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hxjpw" podUID="0b97d7e0-8c0d-425f-927c-cdf926f3b9fb" Mar 20 11:12:06 crc kubenswrapper[4695]: I0320 11:12:06.403779 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-zs6kl"] Mar 20 11:12:06 crc kubenswrapper[4695]: I0320 11:12:06.412111 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-hxjpw"] Mar 20 11:12:06 crc kubenswrapper[4695]: E0320 11:12:06.413988 4695 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5n7mh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-zs6kl_openstack-operators(a7f91c18-4219-4040-b474-4d38d377071a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 11:12:06 crc kubenswrapper[4695]: E0320 11:12:06.415236 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-zs6kl" podUID="a7f91c18-4219-4040-b474-4d38d377071a" Mar 20 11:12:06 crc kubenswrapper[4695]: I0320 11:12:06.426609 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566746-x62p7"] Mar 20 11:12:06 crc kubenswrapper[4695]: I0320 11:12:06.431708 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566746-x62p7"] Mar 20 11:12:06 crc kubenswrapper[4695]: I0320 11:12:06.911674 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1891389e-4ca3-4ba6-9eff-b2ee13a43180" path="/var/lib/kubelet/pods/1891389e-4ca3-4ba6-9eff-b2ee13a43180/volumes" Mar 20 11:12:07 crc kubenswrapper[4695]: I0320 11:12:07.066252 4695 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-zs6kl" event={"ID":"a7f91c18-4219-4040-b474-4d38d377071a","Type":"ContainerStarted","Data":"ab7a64be2962047fccad1310cbfd75701e4f9e601141458deefac33a01db3efb"} Mar 20 11:12:07 crc kubenswrapper[4695]: E0320 11:12:07.068650 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-zs6kl" podUID="a7f91c18-4219-4040-b474-4d38d377071a" Mar 20 11:12:07 crc kubenswrapper[4695]: I0320 11:12:07.101608 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-njjcr" event={"ID":"498749d1-4031-4083-bb7b-e2640519795e","Type":"ContainerStarted","Data":"f416022b21b335dbf6d5a3f416a1f69d553b598697835e58fa05f56216c693ba"} Mar 20 11:12:07 crc kubenswrapper[4695]: I0320 11:12:07.108824 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-z62m6" event={"ID":"d2a6f843-5ef0-48d2-9582-4c56551531a9","Type":"ContainerStarted","Data":"74ae222fed657edfd3a1daca6c0d2f8b60d3abdfed6cfadfa54af860cd93aa90"} Mar 20 11:12:07 crc kubenswrapper[4695]: E0320 11:12:07.115890 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-z62m6" podUID="d2a6f843-5ef0-48d2-9582-4c56551531a9" Mar 20 11:12:07 crc kubenswrapper[4695]: I0320 11:12:07.162656 4695 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hvnnc" event={"ID":"f4807ed5-3bee-42a3-a23b-e7473fc1b833","Type":"ContainerStarted","Data":"19373102eb4c076cb57123184e2bdf518dedddb8ba4f58c5dbbdbff3f0aaf10f"} Mar 20 11:12:07 crc kubenswrapper[4695]: I0320 11:12:07.220674 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-lnj2n" event={"ID":"4a3d1db2-ad95-4c29-9d6f-6d6d5dfb1a1a","Type":"ContainerStarted","Data":"ff82d30f4aa5394acdb3a4653ef21dbcb5fd6417281dcf93bfa0cae6783efbc5"} Mar 20 11:12:07 crc kubenswrapper[4695]: E0320 11:12:07.228052 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-lnj2n" podUID="4a3d1db2-ad95-4c29-9d6f-6d6d5dfb1a1a" Mar 20 11:12:07 crc kubenswrapper[4695]: I0320 11:12:07.231525 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8kxp" event={"ID":"7e6a711c-3208-459c-9f80-f29d5bbd0177","Type":"ContainerStarted","Data":"49a9d07a4efbafbc43f2ac82a0cdebef79c5d9364dcc7c3c152bbd3ed10cc493"} Mar 20 11:12:07 crc kubenswrapper[4695]: I0320 11:12:07.239612 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566752-ws9vl" event={"ID":"df3aebe2-0698-4648-ac7c-eae261c6f8c1","Type":"ContainerDied","Data":"50c4e3f1f03768a28d1936b8ecf058b9574f502bef7cbad5f9bfac8e5c903e66"} Mar 20 11:12:07 crc kubenswrapper[4695]: I0320 11:12:07.239791 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566752-ws9vl" Mar 20 11:12:07 crc kubenswrapper[4695]: I0320 11:12:07.250945 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50c4e3f1f03768a28d1936b8ecf058b9574f502bef7cbad5f9bfac8e5c903e66" Mar 20 11:12:07 crc kubenswrapper[4695]: I0320 11:12:07.251016 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-tcghf" event={"ID":"39cbf988-66c1-4ac9-9595-3cf263cde0aa","Type":"ContainerStarted","Data":"8bc69cb3ea25627b5a5a7b86d1dec8311345ea6cc1de70acb9ba15d5390d1594"} Mar 20 11:12:07 crc kubenswrapper[4695]: I0320 11:12:07.258155 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4q99q" event={"ID":"26fc4733-51dc-4a8d-ba8e-03bd966cac17","Type":"ContainerStarted","Data":"ff6133a128fb313a2843c8fe947259a9ecf095e1d327f0ab8d9a65daef11e8bf"} Mar 20 11:12:07 crc kubenswrapper[4695]: I0320 11:12:07.259612 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hxjpw" event={"ID":"0b97d7e0-8c0d-425f-927c-cdf926f3b9fb","Type":"ContainerStarted","Data":"010c08ac76076ccbd8489fa6067591cb89d1b55099cb96ebd5b616398fe01fe1"} Mar 20 11:12:07 crc kubenswrapper[4695]: E0320 11:12:07.264057 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hxjpw" podUID="0b97d7e0-8c0d-425f-927c-cdf926f3b9fb" Mar 20 11:12:07 crc kubenswrapper[4695]: I0320 11:12:07.277412 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-zcgwb" event={"ID":"9ab1dbad-3ea8-4ed7-9284-d27f1516c26c","Type":"ContainerStarted","Data":"0396bb4ab823d039bd4e28b07233958bf8b21b75e2646bc3354e2e90ad13c475"} Mar 20 11:12:07 crc kubenswrapper[4695]: I0320 11:12:07.295035 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-qwltk" event={"ID":"23e52d31-9d42-439d-95a1-1761dee30f57","Type":"ContainerStarted","Data":"924a33bf266be32644f1fc6d7ba93aca206f9e4bbe873504a5957178bb558472"} Mar 20 11:12:07 crc kubenswrapper[4695]: I0320 11:12:07.307532 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16ad72ba-9b7f-47fc-8216-147e439de734-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-m4c8w\" (UID: \"16ad72ba-9b7f-47fc-8216-147e439de734\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m4c8w" Mar 20 11:12:07 crc kubenswrapper[4695]: E0320 11:12:07.308109 4695 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:12:07 crc kubenswrapper[4695]: E0320 11:12:07.308219 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16ad72ba-9b7f-47fc-8216-147e439de734-cert podName:16ad72ba-9b7f-47fc-8216-147e439de734 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:11.308191154 +0000 UTC m=+1109.088796717 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/16ad72ba-9b7f-47fc-8216-147e439de734-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-m4c8w" (UID: "16ad72ba-9b7f-47fc-8216-147e439de734") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:12:07 crc kubenswrapper[4695]: I0320 11:12:07.409792 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-5w2r2\" (UID: \"53c876bd-49e1-4e7c-9673-91ebcd6b19a0\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-5w2r2" Mar 20 11:12:07 crc kubenswrapper[4695]: I0320 11:12:07.410038 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-5w2r2\" (UID: \"53c876bd-49e1-4e7c-9673-91ebcd6b19a0\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-5w2r2" Mar 20 11:12:07 crc kubenswrapper[4695]: E0320 11:12:07.410083 4695 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 11:12:07 crc kubenswrapper[4695]: E0320 11:12:07.410158 4695 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 11:12:07 crc kubenswrapper[4695]: E0320 11:12:07.410189 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-metrics-certs podName:53c876bd-49e1-4e7c-9673-91ebcd6b19a0 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:11.410160128 +0000 UTC m=+1109.190765691 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-metrics-certs") pod "openstack-operator-controller-manager-6697dffbc-5w2r2" (UID: "53c876bd-49e1-4e7c-9673-91ebcd6b19a0") : secret "metrics-server-cert" not found Mar 20 11:12:07 crc kubenswrapper[4695]: E0320 11:12:07.410212 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-webhook-certs podName:53c876bd-49e1-4e7c-9673-91ebcd6b19a0 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:11.410202719 +0000 UTC m=+1109.190808292 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-webhook-certs") pod "openstack-operator-controller-manager-6697dffbc-5w2r2" (UID: "53c876bd-49e1-4e7c-9673-91ebcd6b19a0") : secret "webhook-server-cert" not found Mar 20 11:12:08 crc kubenswrapper[4695]: E0320 11:12:08.371092 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-lnj2n" podUID="4a3d1db2-ad95-4c29-9d6f-6d6d5dfb1a1a" Mar 20 11:12:08 crc kubenswrapper[4695]: E0320 11:12:08.371234 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-zs6kl" podUID="a7f91c18-4219-4040-b474-4d38d377071a" Mar 20 11:12:08 crc kubenswrapper[4695]: E0320 11:12:08.438782 4695 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hxjpw" podUID="0b97d7e0-8c0d-425f-927c-cdf926f3b9fb" Mar 20 11:12:08 crc kubenswrapper[4695]: E0320 11:12:08.439016 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-z62m6" podUID="d2a6f843-5ef0-48d2-9582-4c56551531a9" Mar 20 11:12:10 crc kubenswrapper[4695]: I0320 11:12:10.325077 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd90b802-5cbc-4d48-a76a-2903fab33ef0-cert\") pod \"infra-operator-controller-manager-669fff9c7c-54mmr\" (UID: \"fd90b802-5cbc-4d48-a76a-2903fab33ef0\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-54mmr" Mar 20 11:12:10 crc kubenswrapper[4695]: E0320 11:12:10.325707 4695 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 11:12:10 crc kubenswrapper[4695]: E0320 11:12:10.325785 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd90b802-5cbc-4d48-a76a-2903fab33ef0-cert podName:fd90b802-5cbc-4d48-a76a-2903fab33ef0 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:18.325766202 +0000 UTC m=+1116.106371775 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fd90b802-5cbc-4d48-a76a-2903fab33ef0-cert") pod "infra-operator-controller-manager-669fff9c7c-54mmr" (UID: "fd90b802-5cbc-4d48-a76a-2903fab33ef0") : secret "infra-operator-webhook-server-cert" not found Mar 20 11:12:11 crc kubenswrapper[4695]: I0320 11:12:11.394947 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16ad72ba-9b7f-47fc-8216-147e439de734-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-m4c8w\" (UID: \"16ad72ba-9b7f-47fc-8216-147e439de734\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m4c8w" Mar 20 11:12:11 crc kubenswrapper[4695]: E0320 11:12:11.395268 4695 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:12:11 crc kubenswrapper[4695]: E0320 11:12:11.395499 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/16ad72ba-9b7f-47fc-8216-147e439de734-cert podName:16ad72ba-9b7f-47fc-8216-147e439de734 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:19.395472253 +0000 UTC m=+1117.176077816 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/16ad72ba-9b7f-47fc-8216-147e439de734-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-m4c8w" (UID: "16ad72ba-9b7f-47fc-8216-147e439de734") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:12:11 crc kubenswrapper[4695]: I0320 11:12:11.497219 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-5w2r2\" (UID: \"53c876bd-49e1-4e7c-9673-91ebcd6b19a0\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-5w2r2" Mar 20 11:12:11 crc kubenswrapper[4695]: I0320 11:12:11.497776 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-5w2r2\" (UID: \"53c876bd-49e1-4e7c-9673-91ebcd6b19a0\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-5w2r2" Mar 20 11:12:11 crc kubenswrapper[4695]: E0320 11:12:11.498027 4695 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 11:12:11 crc kubenswrapper[4695]: E0320 11:12:11.498129 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-metrics-certs podName:53c876bd-49e1-4e7c-9673-91ebcd6b19a0 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:19.498106334 +0000 UTC m=+1117.278711897 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-metrics-certs") pod "openstack-operator-controller-manager-6697dffbc-5w2r2" (UID: "53c876bd-49e1-4e7c-9673-91ebcd6b19a0") : secret "metrics-server-cert" not found Mar 20 11:12:11 crc kubenswrapper[4695]: E0320 11:12:11.498606 4695 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 11:12:11 crc kubenswrapper[4695]: E0320 11:12:11.498639 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-webhook-certs podName:53c876bd-49e1-4e7c-9673-91ebcd6b19a0 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:19.498629147 +0000 UTC m=+1117.279234710 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-webhook-certs") pod "openstack-operator-controller-manager-6697dffbc-5w2r2" (UID: "53c876bd-49e1-4e7c-9673-91ebcd6b19a0") : secret "webhook-server-cert" not found Mar 20 11:12:18 crc kubenswrapper[4695]: I0320 11:12:18.386003 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd90b802-5cbc-4d48-a76a-2903fab33ef0-cert\") pod \"infra-operator-controller-manager-669fff9c7c-54mmr\" (UID: \"fd90b802-5cbc-4d48-a76a-2903fab33ef0\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-54mmr" Mar 20 11:12:18 crc kubenswrapper[4695]: E0320 11:12:18.386303 4695 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 11:12:18 crc kubenswrapper[4695]: E0320 11:12:18.386980 4695 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd90b802-5cbc-4d48-a76a-2903fab33ef0-cert 
podName:fd90b802-5cbc-4d48-a76a-2903fab33ef0 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:34.386945454 +0000 UTC m=+1132.167551017 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/fd90b802-5cbc-4d48-a76a-2903fab33ef0-cert") pod "infra-operator-controller-manager-669fff9c7c-54mmr" (UID: "fd90b802-5cbc-4d48-a76a-2903fab33ef0") : secret "infra-operator-webhook-server-cert" not found Mar 20 11:12:19 crc kubenswrapper[4695]: I0320 11:12:19.403528 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16ad72ba-9b7f-47fc-8216-147e439de734-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-m4c8w\" (UID: \"16ad72ba-9b7f-47fc-8216-147e439de734\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m4c8w" Mar 20 11:12:19 crc kubenswrapper[4695]: I0320 11:12:19.411534 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/16ad72ba-9b7f-47fc-8216-147e439de734-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-m4c8w\" (UID: \"16ad72ba-9b7f-47fc-8216-147e439de734\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m4c8w" Mar 20 11:12:19 crc kubenswrapper[4695]: I0320 11:12:19.505091 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-5w2r2\" (UID: \"53c876bd-49e1-4e7c-9673-91ebcd6b19a0\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-5w2r2" Mar 20 11:12:19 crc kubenswrapper[4695]: I0320 11:12:19.505633 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-5w2r2\" (UID: \"53c876bd-49e1-4e7c-9673-91ebcd6b19a0\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-5w2r2" Mar 20 11:12:19 crc kubenswrapper[4695]: I0320 11:12:19.509791 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-5w2r2\" (UID: \"53c876bd-49e1-4e7c-9673-91ebcd6b19a0\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-5w2r2" Mar 20 11:12:19 crc kubenswrapper[4695]: I0320 11:12:19.514738 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/53c876bd-49e1-4e7c-9673-91ebcd6b19a0-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-5w2r2\" (UID: \"53c876bd-49e1-4e7c-9673-91ebcd6b19a0\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-5w2r2" Mar 20 11:12:19 crc kubenswrapper[4695]: I0320 11:12:19.544875 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-5w2r2" Mar 20 11:12:19 crc kubenswrapper[4695]: I0320 11:12:19.682267 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m4c8w" Mar 20 11:12:21 crc kubenswrapper[4695]: E0320 11:12:21.771421 4695 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da" Mar 20 11:12:21 crc kubenswrapper[4695]: E0320 11:12:21.772050 4695 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xddgv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 
8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-55f864c847-w62g8_openstack-operators(5ce3aca1-15ad-43a5-be8f-b7c5580fcb59): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:12:21 crc kubenswrapper[4695]: E0320 11:12:21.773259 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-w62g8" podUID="5ce3aca1-15ad-43a5-be8f-b7c5580fcb59" Mar 20 11:12:21 crc kubenswrapper[4695]: E0320 11:12:21.878976 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da\\\"\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-w62g8" podUID="5ce3aca1-15ad-43a5-be8f-b7c5580fcb59" Mar 20 11:12:26 crc kubenswrapper[4695]: E0320 11:12:26.152429 4695 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a" Mar 20 11:12:26 crc kubenswrapper[4695]: E0320 11:12:26.153276 4695 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wm6jh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5b9f45d989-4q99q_openstack-operators(26fc4733-51dc-4a8d-ba8e-03bd966cac17): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:12:26 crc kubenswrapper[4695]: E0320 11:12:26.154470 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4q99q" podUID="26fc4733-51dc-4a8d-ba8e-03bd966cac17" Mar 20 11:12:27 crc kubenswrapper[4695]: E0320 11:12:27.124825 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4q99q" podUID="26fc4733-51dc-4a8d-ba8e-03bd966cac17" Mar 20 11:12:27 crc kubenswrapper[4695]: E0320 11:12:27.319382 4695 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:7562d3e09bdac17f447f4523c5bd784c5f5ab5ca9cb2370a03b86126d6d7301d" Mar 20 11:12:27 crc kubenswrapper[4695]: E0320 11:12:27.319703 4695 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:7562d3e09bdac17f447f4523c5bd784c5f5ab5ca9cb2370a03b86126d6d7301d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dflc4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-59bc569d95-l29bw_openstack-operators(c4f4cf94-cced-45aa-9d30-2a60e6a9e291): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:12:27 crc kubenswrapper[4695]: E0320 11:12:27.320939 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-l29bw" podUID="c4f4cf94-cced-45aa-9d30-2a60e6a9e291" Mar 20 11:12:28 crc kubenswrapper[4695]: E0320 11:12:28.216202 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:7562d3e09bdac17f447f4523c5bd784c5f5ab5ca9cb2370a03b86126d6d7301d\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-l29bw" podUID="c4f4cf94-cced-45aa-9d30-2a60e6a9e291" Mar 20 11:12:28 crc kubenswrapper[4695]: E0320 11:12:28.314644 4695 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad" Mar 20 11:12:28 crc kubenswrapper[4695]: E0320 11:12:28.314902 4695 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xx6cp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-588d4d986b-5t427_openstack-operators(58250fd6-7e5e-429d-907a-c0f2725f029f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:12:28 crc kubenswrapper[4695]: E0320 11:12:28.316090 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5t427" podUID="58250fd6-7e5e-429d-907a-c0f2725f029f" Mar 20 11:12:29 crc kubenswrapper[4695]: E0320 11:12:29.111381 4695 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113" Mar 20 11:12:29 crc kubenswrapper[4695]: E0320 11:12:29.111702 4695 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l44bw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-8464cc45fb-2l9vk_openstack-operators(bef240d4-6041-44e9-8228-f707a5f2f8eb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:12:29 crc kubenswrapper[4695]: E0320 11:12:29.112947 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-2l9vk" podUID="bef240d4-6041-44e9-8228-f707a5f2f8eb" Mar 20 11:12:29 crc kubenswrapper[4695]: E0320 11:12:29.224625 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-2l9vk" podUID="bef240d4-6041-44e9-8228-f707a5f2f8eb" Mar 20 11:12:29 crc kubenswrapper[4695]: E0320 11:12:29.225711 4695 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad\\\"\"" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5t427" podUID="58250fd6-7e5e-429d-907a-c0f2725f029f" Mar 20 11:12:34 crc kubenswrapper[4695]: E0320 11:12:34.076220 4695 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900" Mar 20 11:12:34 crc kubenswrapper[4695]: E0320 11:12:34.077305 4695 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ftrm6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-67dd5f86f5-sjz6g_openstack-operators(99970094-eb53-4489-ba1d-1f650470c848): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:12:34 crc kubenswrapper[4695]: E0320 11:12:34.080190 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-sjz6g" podUID="99970094-eb53-4489-ba1d-1f650470c848" Mar 20 11:12:34 crc kubenswrapper[4695]: E0320 11:12:34.266631 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900\\\"\"" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-sjz6g" podUID="99970094-eb53-4489-ba1d-1f650470c848" Mar 20 11:12:34 crc kubenswrapper[4695]: I0320 11:12:34.461803 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd90b802-5cbc-4d48-a76a-2903fab33ef0-cert\") pod \"infra-operator-controller-manager-669fff9c7c-54mmr\" (UID: \"fd90b802-5cbc-4d48-a76a-2903fab33ef0\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-54mmr" Mar 20 11:12:34 crc kubenswrapper[4695]: I0320 11:12:34.473177 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/fd90b802-5cbc-4d48-a76a-2903fab33ef0-cert\") pod \"infra-operator-controller-manager-669fff9c7c-54mmr\" (UID: \"fd90b802-5cbc-4d48-a76a-2903fab33ef0\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-54mmr" Mar 20 11:12:34 crc kubenswrapper[4695]: I0320 11:12:34.665270 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-54mmr" Mar 20 11:12:34 crc kubenswrapper[4695]: E0320 11:12:34.737334 4695 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42" Mar 20 11:12:34 crc kubenswrapper[4695]: E0320 11:12:34.737620 4695 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6k6g2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-njjcr_openstack-operators(498749d1-4031-4083-bb7b-e2640519795e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:12:34 crc kubenswrapper[4695]: E0320 11:12:34.738895 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-njjcr" podUID="498749d1-4031-4083-bb7b-e2640519795e" Mar 20 11:12:35 crc kubenswrapper[4695]: E0320 11:12:35.284744 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-njjcr" podUID="498749d1-4031-4083-bb7b-e2640519795e" Mar 20 11:12:35 crc kubenswrapper[4695]: E0320 11:12:35.763349 4695 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d" Mar 20 11:12:35 crc kubenswrapper[4695]: E0320 11:12:35.763632 4695 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vbfwn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-79df6bcc97-cg8tn_openstack-operators(d46fd923-64a9-48cf-b3ea-05d6a676d7e1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:12:35 crc kubenswrapper[4695]: E0320 11:12:35.765538 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cg8tn" podUID="d46fd923-64a9-48cf-b3ea-05d6a676d7e1" Mar 20 11:12:36 crc kubenswrapper[4695]: E0320 11:12:36.283165 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d\\\"\"" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cg8tn" podUID="d46fd923-64a9-48cf-b3ea-05d6a676d7e1" Mar 20 11:12:36 crc kubenswrapper[4695]: E0320 11:12:36.805591 4695 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807" Mar 20 11:12:36 crc kubenswrapper[4695]: E0320 11:12:36.805937 4695 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8zljj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-hvnnc_openstack-operators(f4807ed5-3bee-42a3-a23b-e7473fc1b833): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:12:36 crc kubenswrapper[4695]: E0320 11:12:36.807218 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hvnnc" podUID="f4807ed5-3bee-42a3-a23b-e7473fc1b833" Mar 20 11:12:37 crc kubenswrapper[4695]: E0320 11:12:37.292465 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hvnnc" podUID="f4807ed5-3bee-42a3-a23b-e7473fc1b833" Mar 20 11:12:37 crc kubenswrapper[4695]: E0320 11:12:37.733078 4695 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:9dd26bc51e7757d84736528d4988a1f980ad50ccb070aef6fc252e32c5c423a8" Mar 20 11:12:37 crc kubenswrapper[4695]: E0320 11:12:37.733301 4695 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:9dd26bc51e7757d84736528d4988a1f980ad50ccb070aef6fc252e32c5c423a8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sjt9z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6f787dddc9-ws2ch_openstack-operators(9a8f730f-c9f3-4467-8b90-cfddd028ee71): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:12:37 crc kubenswrapper[4695]: E0320 11:12:37.735410 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-ws2ch" podUID="9a8f730f-c9f3-4467-8b90-cfddd028ee71" Mar 20 11:12:38 crc kubenswrapper[4695]: E0320 11:12:38.298393 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:9dd26bc51e7757d84736528d4988a1f980ad50ccb070aef6fc252e32c5c423a8\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-ws2ch" podUID="9a8f730f-c9f3-4467-8b90-cfddd028ee71" Mar 20 11:12:38 crc kubenswrapper[4695]: E0320 11:12:38.450045 4695 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a" Mar 20 11:12:38 crc kubenswrapper[4695]: E0320 11:12:38.450295 4695 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v4cqx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-zcgwb_openstack-operators(9ab1dbad-3ea8-4ed7-9284-d27f1516c26c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:12:38 crc kubenswrapper[4695]: E0320 11:12:38.451751 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-zcgwb" podUID="9ab1dbad-3ea8-4ed7-9284-d27f1516c26c" Mar 20 11:12:38 crc kubenswrapper[4695]: E0320 11:12:38.917405 4695 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56" Mar 20 11:12:38 crc kubenswrapper[4695]: E0320 11:12:38.917628 4695 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rhjgh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-tcghf_openstack-operators(39cbf988-66c1-4ac9-9595-3cf263cde0aa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:12:38 crc kubenswrapper[4695]: E0320 11:12:38.918816 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-tcghf" podUID="39cbf988-66c1-4ac9-9595-3cf263cde0aa" Mar 20 11:12:39 crc kubenswrapper[4695]: E0320 11:12:39.310831 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-tcghf" podUID="39cbf988-66c1-4ac9-9595-3cf263cde0aa" Mar 20 11:12:39 crc kubenswrapper[4695]: E0320 11:12:39.311066 4695 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-zcgwb" podUID="9ab1dbad-3ea8-4ed7-9284-d27f1516c26c" Mar 20 11:12:39 crc kubenswrapper[4695]: I0320 11:12:39.958988 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m4c8w"] Mar 20 11:12:40 crc kubenswrapper[4695]: W0320 11:12:40.141119 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53c876bd_49e1_4e7c_9673_91ebcd6b19a0.slice/crio-0c04fd84dddaa996d2bc916eff71104cb7531734bbaca568dd17fd8481f9c13d WatchSource:0}: Error finding container 0c04fd84dddaa996d2bc916eff71104cb7531734bbaca568dd17fd8481f9c13d: Status 404 returned error can't find the container with id 0c04fd84dddaa996d2bc916eff71104cb7531734bbaca568dd17fd8481f9c13d Mar 20 11:12:40 crc kubenswrapper[4695]: I0320 11:12:40.142034 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6697dffbc-5w2r2"] Mar 20 11:12:40 crc kubenswrapper[4695]: I0320 11:12:40.225242 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-669fff9c7c-54mmr"] Mar 20 11:12:40 crc kubenswrapper[4695]: I0320 11:12:40.331140 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-z62m6" event={"ID":"d2a6f843-5ef0-48d2-9582-4c56551531a9","Type":"ContainerStarted","Data":"5afda38fde332b5e313a7727394cfd47d3bddf0fb6f6c77bfa35abecfccdad46"} Mar 20 11:12:40 crc kubenswrapper[4695]: I0320 11:12:40.332482 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/neutron-operator-controller-manager-767865f676-z62m6" Mar 20 11:12:40 crc kubenswrapper[4695]: I0320 11:12:40.333900 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8kxp" event={"ID":"7e6a711c-3208-459c-9f80-f29d5bbd0177","Type":"ContainerStarted","Data":"10d2dfba0a1dff5d62cfd8ec97dd4fed37751581dfdffa52183671a81722e61c"} Mar 20 11:12:40 crc kubenswrapper[4695]: I0320 11:12:40.334444 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8kxp" Mar 20 11:12:40 crc kubenswrapper[4695]: I0320 11:12:40.335657 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-l29bw" event={"ID":"c4f4cf94-cced-45aa-9d30-2a60e6a9e291","Type":"ContainerStarted","Data":"1bd2b4d617803b0a4a3529c412e7f9d207fc75aafbd3da282a52fe381b10fb21"} Mar 20 11:12:40 crc kubenswrapper[4695]: I0320 11:12:40.336073 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-l29bw" Mar 20 11:12:40 crc kubenswrapper[4695]: I0320 11:12:40.337147 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4q99q" event={"ID":"26fc4733-51dc-4a8d-ba8e-03bd966cac17","Type":"ContainerStarted","Data":"0cde59b27370339839553c8f2784ec001e78b0d30feed1c88c2ce79bca656bc5"} Mar 20 11:12:40 crc kubenswrapper[4695]: I0320 11:12:40.337538 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4q99q" Mar 20 11:12:40 crc kubenswrapper[4695]: I0320 11:12:40.338507 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-w62g8" 
event={"ID":"5ce3aca1-15ad-43a5-be8f-b7c5580fcb59","Type":"ContainerStarted","Data":"28c53dc31b405c111f2cc0fea93b7db5103674da21675e718c85cf52202332c0"} Mar 20 11:12:40 crc kubenswrapper[4695]: I0320 11:12:40.338934 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-w62g8" Mar 20 11:12:40 crc kubenswrapper[4695]: I0320 11:12:40.339683 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-5w2r2" event={"ID":"53c876bd-49e1-4e7c-9673-91ebcd6b19a0","Type":"ContainerStarted","Data":"0c04fd84dddaa996d2bc916eff71104cb7531734bbaca568dd17fd8481f9c13d"} Mar 20 11:12:40 crc kubenswrapper[4695]: I0320 11:12:40.348131 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hxjpw" event={"ID":"0b97d7e0-8c0d-425f-927c-cdf926f3b9fb","Type":"ContainerStarted","Data":"2740f441a340131ecfae25f5546fd62b83c6c65afce2aa07087f13abffbeb8e4"} Mar 20 11:12:40 crc kubenswrapper[4695]: I0320 11:12:40.348730 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hxjpw" Mar 20 11:12:40 crc kubenswrapper[4695]: I0320 11:12:40.350253 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-lnj2n" event={"ID":"4a3d1db2-ad95-4c29-9d6f-6d6d5dfb1a1a","Type":"ContainerStarted","Data":"871f42cf24d36bf294d80e3d12e2475182243076d389f7265546da838db63ffc"} Mar 20 11:12:40 crc kubenswrapper[4695]: I0320 11:12:40.351014 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-lnj2n" Mar 20 11:12:40 crc kubenswrapper[4695]: I0320 11:12:40.351851 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-54mmr" event={"ID":"fd90b802-5cbc-4d48-a76a-2903fab33ef0","Type":"ContainerStarted","Data":"f2e8562482217f0f095e631440bd2e7e2ca70bed2243296e2fdd0caa2cb6cd50"} Mar 20 11:12:40 crc kubenswrapper[4695]: I0320 11:12:40.352777 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-qwltk" event={"ID":"23e52d31-9d42-439d-95a1-1761dee30f57","Type":"ContainerStarted","Data":"b5cb4ab1d60af3596259d1b1f000aa90dc3faef4941d2bad903f04538c225e36"} Mar 20 11:12:40 crc kubenswrapper[4695]: I0320 11:12:40.353178 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-qwltk" Mar 20 11:12:40 crc kubenswrapper[4695]: I0320 11:12:40.353878 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m4c8w" event={"ID":"16ad72ba-9b7f-47fc-8216-147e439de734","Type":"ContainerStarted","Data":"35da4a1ac56bfbbf8238634490cd21dfd51336a6034b82a527f8a75404e06b6e"} Mar 20 11:12:40 crc kubenswrapper[4695]: I0320 11:12:40.354737 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-zs6kl" event={"ID":"a7f91c18-4219-4040-b474-4d38d377071a","Type":"ContainerStarted","Data":"0980adba9e9b4eb2f11cb9aac7cdcb28d936867b3df00753eb258a017f4ae36a"} Mar 20 11:12:40 crc kubenswrapper[4695]: I0320 11:12:40.355309 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-zs6kl" Mar 20 11:12:40 crc kubenswrapper[4695]: I0320 11:12:40.361160 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-g8s68" 
event={"ID":"a6a059b0-61e0-4592-8661-e480f9573c66","Type":"ContainerStarted","Data":"6807e17e6e71185be0eb45a1548b4a2927fadb4c85fce36e5a56eb64850e5783"} Mar 20 11:12:40 crc kubenswrapper[4695]: I0320 11:12:40.361751 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-g8s68" Mar 20 11:12:41 crc kubenswrapper[4695]: I0320 11:12:41.029481 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-z62m6" podStartSLOduration=6.034710139 podStartE2EDuration="39.029454262s" podCreationTimestamp="2026-03-20 11:12:02 +0000 UTC" firstStartedPulling="2026-03-20 11:12:06.387664754 +0000 UTC m=+1104.168270317" lastFinishedPulling="2026-03-20 11:12:39.382408867 +0000 UTC m=+1137.163014440" observedRunningTime="2026-03-20 11:12:40.69621308 +0000 UTC m=+1138.476818663" watchObservedRunningTime="2026-03-20 11:12:41.029454262 +0000 UTC m=+1138.810059825" Mar 20 11:12:41 crc kubenswrapper[4695]: I0320 11:12:41.031020 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hxjpw" podStartSLOduration=5.933597286 podStartE2EDuration="39.031009311s" podCreationTimestamp="2026-03-20 11:12:02 +0000 UTC" firstStartedPulling="2026-03-20 11:12:06.397162216 +0000 UTC m=+1104.177767779" lastFinishedPulling="2026-03-20 11:12:39.494574241 +0000 UTC m=+1137.275179804" observedRunningTime="2026-03-20 11:12:41.026397134 +0000 UTC m=+1138.807002697" watchObservedRunningTime="2026-03-20 11:12:41.031009311 +0000 UTC m=+1138.811614874" Mar 20 11:12:41 crc kubenswrapper[4695]: I0320 11:12:41.044030 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4q99q" podStartSLOduration=5.819098272 podStartE2EDuration="39.044013924s" podCreationTimestamp="2026-03-20 
11:12:02 +0000 UTC" firstStartedPulling="2026-03-20 11:12:06.342274124 +0000 UTC m=+1104.122879687" lastFinishedPulling="2026-03-20 11:12:39.567189776 +0000 UTC m=+1137.347795339" observedRunningTime="2026-03-20 11:12:41.042846504 +0000 UTC m=+1138.823452067" watchObservedRunningTime="2026-03-20 11:12:41.044013924 +0000 UTC m=+1138.824619487" Mar 20 11:12:41 crc kubenswrapper[4695]: I0320 11:12:41.070969 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-l29bw" podStartSLOduration=5.375177954 podStartE2EDuration="39.070937151s" podCreationTimestamp="2026-03-20 11:12:02 +0000 UTC" firstStartedPulling="2026-03-20 11:12:05.840257953 +0000 UTC m=+1103.620863516" lastFinishedPulling="2026-03-20 11:12:39.53601715 +0000 UTC m=+1137.316622713" observedRunningTime="2026-03-20 11:12:41.068758375 +0000 UTC m=+1138.849363938" watchObservedRunningTime="2026-03-20 11:12:41.070937151 +0000 UTC m=+1138.851542714" Mar 20 11:12:41 crc kubenswrapper[4695]: I0320 11:12:41.105273 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-w62g8" podStartSLOduration=5.546597891 podStartE2EDuration="39.105252377s" podCreationTimestamp="2026-03-20 11:12:02 +0000 UTC" firstStartedPulling="2026-03-20 11:12:05.885309433 +0000 UTC m=+1103.665914986" lastFinishedPulling="2026-03-20 11:12:39.443963909 +0000 UTC m=+1137.224569472" observedRunningTime="2026-03-20 11:12:41.098032393 +0000 UTC m=+1138.878637956" watchObservedRunningTime="2026-03-20 11:12:41.105252377 +0000 UTC m=+1138.885857940" Mar 20 11:12:41 crc kubenswrapper[4695]: I0320 11:12:41.152820 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-zs6kl" podStartSLOduration=6.07628764 podStartE2EDuration="39.152780681s" podCreationTimestamp="2026-03-20 11:12:02 +0000 UTC" 
firstStartedPulling="2026-03-20 11:12:06.41375614 +0000 UTC m=+1104.194361703" lastFinishedPulling="2026-03-20 11:12:39.490249181 +0000 UTC m=+1137.270854744" observedRunningTime="2026-03-20 11:12:41.150490753 +0000 UTC m=+1138.931096316" watchObservedRunningTime="2026-03-20 11:12:41.152780681 +0000 UTC m=+1138.933386244" Mar 20 11:12:41 crc kubenswrapper[4695]: I0320 11:12:41.256360 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-lnj2n" podStartSLOduration=6.226717183 podStartE2EDuration="39.256338096s" podCreationTimestamp="2026-03-20 11:12:02 +0000 UTC" firstStartedPulling="2026-03-20 11:12:06.393529404 +0000 UTC m=+1104.174134967" lastFinishedPulling="2026-03-20 11:12:39.423150317 +0000 UTC m=+1137.203755880" observedRunningTime="2026-03-20 11:12:41.248895436 +0000 UTC m=+1139.029500999" watchObservedRunningTime="2026-03-20 11:12:41.256338096 +0000 UTC m=+1139.036943659" Mar 20 11:12:41 crc kubenswrapper[4695]: I0320 11:12:41.270360 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-qwltk" podStartSLOduration=7.865265633 podStartE2EDuration="39.270335744s" podCreationTimestamp="2026-03-20 11:12:02 +0000 UTC" firstStartedPulling="2026-03-20 11:12:06.314366632 +0000 UTC m=+1104.094972195" lastFinishedPulling="2026-03-20 11:12:37.719436743 +0000 UTC m=+1135.500042306" observedRunningTime="2026-03-20 11:12:41.265670004 +0000 UTC m=+1139.046275567" watchObservedRunningTime="2026-03-20 11:12:41.270335744 +0000 UTC m=+1139.050941307" Mar 20 11:12:41 crc kubenswrapper[4695]: I0320 11:12:41.302934 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-g8s68" podStartSLOduration=6.70089616 podStartE2EDuration="39.302887615s" podCreationTimestamp="2026-03-20 11:12:02 +0000 UTC" 
firstStartedPulling="2026-03-20 11:12:05.117152911 +0000 UTC m=+1102.897758484" lastFinishedPulling="2026-03-20 11:12:37.719144376 +0000 UTC m=+1135.499749939" observedRunningTime="2026-03-20 11:12:41.301328015 +0000 UTC m=+1139.081933588" watchObservedRunningTime="2026-03-20 11:12:41.302887615 +0000 UTC m=+1139.083493178" Mar 20 11:12:41 crc kubenswrapper[4695]: I0320 11:12:41.331668 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8kxp" podStartSLOduration=7.684899175 podStartE2EDuration="39.331643449s" podCreationTimestamp="2026-03-20 11:12:02 +0000 UTC" firstStartedPulling="2026-03-20 11:12:06.072270928 +0000 UTC m=+1103.852876491" lastFinishedPulling="2026-03-20 11:12:37.719015212 +0000 UTC m=+1135.499620765" observedRunningTime="2026-03-20 11:12:41.323680756 +0000 UTC m=+1139.104286329" watchObservedRunningTime="2026-03-20 11:12:41.331643449 +0000 UTC m=+1139.112249012" Mar 20 11:12:41 crc kubenswrapper[4695]: I0320 11:12:41.386572 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-5w2r2" event={"ID":"53c876bd-49e1-4e7c-9673-91ebcd6b19a0","Type":"ContainerStarted","Data":"504248abd57c5def74e8850ff44c039eb4b4353b8da95a1575daabc378c2c529"} Mar 20 11:12:41 crc kubenswrapper[4695]: I0320 11:12:41.613078 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-5w2r2" podStartSLOduration=38.613050867 podStartE2EDuration="38.613050867s" podCreationTimestamp="2026-03-20 11:12:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:12:41.60416063 +0000 UTC m=+1139.384766193" watchObservedRunningTime="2026-03-20 11:12:41.613050867 +0000 UTC m=+1139.393656430" Mar 20 11:12:42 crc kubenswrapper[4695]: I0320 
11:12:42.395425 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-5w2r2" Mar 20 11:12:43 crc kubenswrapper[4695]: I0320 11:12:43.411486 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5t427" event={"ID":"58250fd6-7e5e-429d-907a-c0f2725f029f","Type":"ContainerStarted","Data":"37d308e1809cd1396ca157c5869b42be4d0db5cd7adef1ba219f0162387776ea"} Mar 20 11:12:43 crc kubenswrapper[4695]: I0320 11:12:43.412702 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5t427" Mar 20 11:12:43 crc kubenswrapper[4695]: I0320 11:12:43.435727 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5t427" podStartSLOduration=4.79629585 podStartE2EDuration="41.435696918s" podCreationTimestamp="2026-03-20 11:12:02 +0000 UTC" firstStartedPulling="2026-03-20 11:12:05.899394403 +0000 UTC m=+1103.679999966" lastFinishedPulling="2026-03-20 11:12:42.538795471 +0000 UTC m=+1140.319401034" observedRunningTime="2026-03-20 11:12:43.435487272 +0000 UTC m=+1141.216092835" watchObservedRunningTime="2026-03-20 11:12:43.435696918 +0000 UTC m=+1141.216302481" Mar 20 11:12:44 crc kubenswrapper[4695]: I0320 11:12:44.370656 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-lnj2n" Mar 20 11:12:44 crc kubenswrapper[4695]: I0320 11:12:44.577664 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-qwltk" Mar 20 11:12:47 crc kubenswrapper[4695]: I0320 11:12:47.933142 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-2l9vk" event={"ID":"bef240d4-6041-44e9-8228-f707a5f2f8eb","Type":"ContainerStarted","Data":"f121a9012dd55bee1b0b89467624ee517df0b6fb75e220a0a025eaf7700332f0"} Mar 20 11:12:47 crc kubenswrapper[4695]: I0320 11:12:47.934080 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-2l9vk" Mar 20 11:12:48 crc kubenswrapper[4695]: I0320 11:12:48.031985 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-2l9vk" podStartSLOduration=8.474401087 podStartE2EDuration="46.031959815s" podCreationTimestamp="2026-03-20 11:12:02 +0000 UTC" firstStartedPulling="2026-03-20 11:12:05.886331369 +0000 UTC m=+1103.666936942" lastFinishedPulling="2026-03-20 11:12:43.443890107 +0000 UTC m=+1141.224495670" observedRunningTime="2026-03-20 11:12:48.028529357 +0000 UTC m=+1145.809134920" watchObservedRunningTime="2026-03-20 11:12:48.031959815 +0000 UTC m=+1145.812565378" Mar 20 11:12:49 crc kubenswrapper[4695]: I0320 11:12:49.555358 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-5w2r2" Mar 20 11:12:52 crc kubenswrapper[4695]: I0320 11:12:52.646951 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-g8s68" Mar 20 11:12:52 crc kubenswrapper[4695]: I0320 11:12:52.683667 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-5t427" Mar 20 11:12:52 crc kubenswrapper[4695]: I0320 11:12:52.791748 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-l29bw" Mar 20 11:12:53 crc 
kubenswrapper[4695]: I0320 11:12:53.097781 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-2l9vk" Mar 20 11:12:53 crc kubenswrapper[4695]: I0320 11:12:53.395230 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-w62g8" Mar 20 11:12:53 crc kubenswrapper[4695]: I0320 11:12:53.484004 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-m8kxp" Mar 20 11:12:53 crc kubenswrapper[4695]: I0320 11:12:53.532667 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-z62m6" Mar 20 11:12:54 crc kubenswrapper[4695]: I0320 11:12:54.426178 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-zs6kl" Mar 20 11:12:54 crc kubenswrapper[4695]: I0320 11:12:54.537387 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-hxjpw" Mar 20 11:12:54 crc kubenswrapper[4695]: I0320 11:12:54.912816 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4q99q" podUID="26fc4733-51dc-4a8d-ba8e-03bd966cac17" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.86:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 11:12:57 crc kubenswrapper[4695]: E0320 11:12:57.968209 4695 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/lmiccini/infra-operator@sha256:2dfd7b906f9cb10eb71281389023c33655956619d0d004bcecdb8fb9de5e9fac" Mar 20 
11:12:57 crc kubenswrapper[4695]: E0320 11:12:57.968520 4695 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/lmiccini/infra-operator@sha256:2dfd7b906f9cb10eb71281389023c33655956619d0d004bcecdb8fb9de5e9fac,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nntjn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-669fff9c7c-54mmr_openstack-operators(fd90b802-5cbc-4d48-a76a-2903fab33ef0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:12:57 crc kubenswrapper[4695]: E0320 11:12:57.969824 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-54mmr" podUID="fd90b802-5cbc-4d48-a76a-2903fab33ef0" Mar 20 11:12:59 crc kubenswrapper[4695]: E0320 11:12:59.518653 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/lmiccini/infra-operator@sha256:2dfd7b906f9cb10eb71281389023c33655956619d0d004bcecdb8fb9de5e9fac\\\"\"" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-54mmr" podUID="fd90b802-5cbc-4d48-a76a-2903fab33ef0" Mar 20 11:13:00 crc kubenswrapper[4695]: I0320 11:13:00.035113 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-zcgwb" event={"ID":"9ab1dbad-3ea8-4ed7-9284-d27f1516c26c","Type":"ContainerStarted","Data":"1a132579c91edf07ee67d193f19ce5d0ceb22242c2bd37208c4f1c8a92fe70f9"} Mar 20 11:13:00 crc kubenswrapper[4695]: I0320 11:13:00.035899 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-zcgwb" Mar 20 11:13:00 crc kubenswrapper[4695]: I0320 11:13:00.038111 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m4c8w" event={"ID":"16ad72ba-9b7f-47fc-8216-147e439de734","Type":"ContainerStarted","Data":"2d99500415bd23e32625dd53029eab40b790a29956b3ea12318d8f66b37235ef"} Mar 20 11:13:00 crc kubenswrapper[4695]: I0320 11:13:00.038278 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m4c8w" Mar 20 11:13:00 crc kubenswrapper[4695]: I0320 11:13:00.040964 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-sjz6g" event={"ID":"99970094-eb53-4489-ba1d-1f650470c848","Type":"ContainerStarted","Data":"8188bd20d4a9ee53402ece16bcc1292966e6419e13eacbed652c2f63c6d943d2"} Mar 20 11:13:00 crc kubenswrapper[4695]: I0320 11:13:00.041333 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-sjz6g" Mar 20 11:13:00 crc kubenswrapper[4695]: I0320 11:13:00.062563 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-zcgwb" podStartSLOduration=4.477533119 podStartE2EDuration="58.062533337s" podCreationTimestamp="2026-03-20 11:12:02 +0000 UTC" firstStartedPulling="2026-03-20 11:12:06.061253357 +0000 UTC m=+1103.841858940" 
lastFinishedPulling="2026-03-20 11:12:59.646253595 +0000 UTC m=+1157.426859158" observedRunningTime="2026-03-20 11:13:00.05717924 +0000 UTC m=+1157.837784823" watchObservedRunningTime="2026-03-20 11:13:00.062533337 +0000 UTC m=+1157.843138900" Mar 20 11:13:00 crc kubenswrapper[4695]: I0320 11:13:00.089103 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m4c8w" podStartSLOduration=38.439874455 podStartE2EDuration="58.089067595s" podCreationTimestamp="2026-03-20 11:12:02 +0000 UTC" firstStartedPulling="2026-03-20 11:12:39.994190982 +0000 UTC m=+1137.774796545" lastFinishedPulling="2026-03-20 11:12:59.643384122 +0000 UTC m=+1157.423989685" observedRunningTime="2026-03-20 11:13:00.087382862 +0000 UTC m=+1157.867988435" watchObservedRunningTime="2026-03-20 11:13:00.089067595 +0000 UTC m=+1157.869673158" Mar 20 11:13:00 crc kubenswrapper[4695]: I0320 11:13:00.257260 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-sjz6g" podStartSLOduration=4.232737516 podStartE2EDuration="58.25723051s" podCreationTimestamp="2026-03-20 11:12:02 +0000 UTC" firstStartedPulling="2026-03-20 11:12:05.650726281 +0000 UTC m=+1103.431331834" lastFinishedPulling="2026-03-20 11:12:59.675219265 +0000 UTC m=+1157.455824828" observedRunningTime="2026-03-20 11:13:00.256402659 +0000 UTC m=+1158.037008242" watchObservedRunningTime="2026-03-20 11:13:00.25723051 +0000 UTC m=+1158.037836083" Mar 20 11:13:01 crc kubenswrapper[4695]: I0320 11:13:01.294639 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cg8tn" event={"ID":"d46fd923-64a9-48cf-b3ea-05d6a676d7e1","Type":"ContainerStarted","Data":"1997ff506e86dcecffdfacfbf0c7a5047083944c59046c6fd5846f34aaeb76ce"} Mar 20 11:13:01 crc kubenswrapper[4695]: I0320 11:13:01.296362 4695 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cg8tn" Mar 20 11:13:01 crc kubenswrapper[4695]: I0320 11:13:01.300600 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-njjcr" event={"ID":"498749d1-4031-4083-bb7b-e2640519795e","Type":"ContainerStarted","Data":"322dac2def2e527768474ea822730cf9530c0d235f8cb5421b7e6a855add378f"} Mar 20 11:13:01 crc kubenswrapper[4695]: I0320 11:13:01.300845 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-njjcr" Mar 20 11:13:01 crc kubenswrapper[4695]: I0320 11:13:01.302745 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hvnnc" event={"ID":"f4807ed5-3bee-42a3-a23b-e7473fc1b833","Type":"ContainerStarted","Data":"75abe893a8bae6c3d0b6e93b7ed28e6cff52d5e7cdded90047a9f3b1cd12e54b"} Mar 20 11:13:01 crc kubenswrapper[4695]: I0320 11:13:01.303285 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hvnnc" Mar 20 11:13:01 crc kubenswrapper[4695]: I0320 11:13:01.308625 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-ws2ch" event={"ID":"9a8f730f-c9f3-4467-8b90-cfddd028ee71","Type":"ContainerStarted","Data":"d15e4a54969706f057d417acecc9148815e5664df651687b6e7afe7a2fea8746"} Mar 20 11:13:01 crc kubenswrapper[4695]: I0320 11:13:01.308986 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-ws2ch" Mar 20 11:13:01 crc kubenswrapper[4695]: I0320 11:13:01.311447 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-tcghf" event={"ID":"39cbf988-66c1-4ac9-9595-3cf263cde0aa","Type":"ContainerStarted","Data":"3cf7e2886eaa4e04a6938826b533f185d0a518c3ab4a4fe3879470e71f6a33f6"} Mar 20 11:13:01 crc kubenswrapper[4695]: I0320 11:13:01.311851 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-tcghf" Mar 20 11:13:01 crc kubenswrapper[4695]: I0320 11:13:01.320879 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cg8tn" podStartSLOduration=5.414109099 podStartE2EDuration="59.320849115s" podCreationTimestamp="2026-03-20 11:12:02 +0000 UTC" firstStartedPulling="2026-03-20 11:12:05.898968322 +0000 UTC m=+1103.679573885" lastFinishedPulling="2026-03-20 11:12:59.805708348 +0000 UTC m=+1157.586313901" observedRunningTime="2026-03-20 11:13:01.316375801 +0000 UTC m=+1159.096981364" watchObservedRunningTime="2026-03-20 11:13:01.320849115 +0000 UTC m=+1159.101454678" Mar 20 11:13:01 crc kubenswrapper[4695]: I0320 11:13:01.341877 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hvnnc" podStartSLOduration=4.30608653 podStartE2EDuration="58.341850801s" podCreationTimestamp="2026-03-20 11:12:03 +0000 UTC" firstStartedPulling="2026-03-20 11:12:06.17529821 +0000 UTC m=+1103.955903773" lastFinishedPulling="2026-03-20 11:13:00.211062481 +0000 UTC m=+1157.991668044" observedRunningTime="2026-03-20 11:13:01.334812821 +0000 UTC m=+1159.115418384" watchObservedRunningTime="2026-03-20 11:13:01.341850801 +0000 UTC m=+1159.122456364" Mar 20 11:13:01 crc kubenswrapper[4695]: I0320 11:13:01.363585 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-tcghf" 
podStartSLOduration=5.23441589 podStartE2EDuration="59.363563046s" podCreationTimestamp="2026-03-20 11:12:02 +0000 UTC" firstStartedPulling="2026-03-20 11:12:06.06333055 +0000 UTC m=+1103.843936113" lastFinishedPulling="2026-03-20 11:13:00.192477706 +0000 UTC m=+1157.973083269" observedRunningTime="2026-03-20 11:13:01.357812749 +0000 UTC m=+1159.138418312" watchObservedRunningTime="2026-03-20 11:13:01.363563046 +0000 UTC m=+1159.144168599" Mar 20 11:13:01 crc kubenswrapper[4695]: I0320 11:13:01.399461 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-njjcr" podStartSLOduration=5.653999086 podStartE2EDuration="59.399413781s" podCreationTimestamp="2026-03-20 11:12:02 +0000 UTC" firstStartedPulling="2026-03-20 11:12:06.062962891 +0000 UTC m=+1103.843568454" lastFinishedPulling="2026-03-20 11:12:59.808377586 +0000 UTC m=+1157.588983149" observedRunningTime="2026-03-20 11:13:01.397325608 +0000 UTC m=+1159.177931171" watchObservedRunningTime="2026-03-20 11:13:01.399413781 +0000 UTC m=+1159.180019344" Mar 20 11:13:01 crc kubenswrapper[4695]: I0320 11:13:01.429008 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-ws2ch" podStartSLOduration=4.565407462 podStartE2EDuration="59.428983287s" podCreationTimestamp="2026-03-20 11:12:02 +0000 UTC" firstStartedPulling="2026-03-20 11:12:05.346981983 +0000 UTC m=+1103.127587546" lastFinishedPulling="2026-03-20 11:13:00.210557808 +0000 UTC m=+1157.991163371" observedRunningTime="2026-03-20 11:13:01.425974 +0000 UTC m=+1159.206579563" watchObservedRunningTime="2026-03-20 11:13:01.428983287 +0000 UTC m=+1159.209588860" Mar 20 11:13:03 crc kubenswrapper[4695]: I0320 11:13:03.872953 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-4q99q" Mar 20 11:13:05 crc 
kubenswrapper[4695]: I0320 11:13:05.597241 4695 scope.go:117] "RemoveContainer" containerID="319e51e37f6eb537ce8bc45a7d54556e4719959f61059a83f46a20d3656f8dd9" Mar 20 11:13:08 crc kubenswrapper[4695]: I0320 11:13:08.431526 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:13:08 crc kubenswrapper[4695]: I0320 11:13:08.432026 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:13:09 crc kubenswrapper[4695]: I0320 11:13:09.690500 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-m4c8w" Mar 20 11:13:13 crc kubenswrapper[4695]: I0320 11:13:13.116672 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cg8tn" Mar 20 11:13:13 crc kubenswrapper[4695]: I0320 11:13:13.142248 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-sjz6g" Mar 20 11:13:13 crc kubenswrapper[4695]: I0320 11:13:13.202637 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-ws2ch" Mar 20 11:13:13 crc kubenswrapper[4695]: I0320 11:13:13.409405 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-tcghf" Mar 20 
11:13:14 crc kubenswrapper[4695]: I0320 11:13:13.806954 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-zcgwb" Mar 20 11:13:14 crc kubenswrapper[4695]: I0320 11:13:14.438362 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-54mmr" event={"ID":"fd90b802-5cbc-4d48-a76a-2903fab33ef0","Type":"ContainerStarted","Data":"c3736fc7fe4e10506fd8fb53292c6d54bea1674057a5301520e0103081665395"} Mar 20 11:13:14 crc kubenswrapper[4695]: I0320 11:13:14.439190 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-54mmr" Mar 20 11:13:14 crc kubenswrapper[4695]: I0320 11:13:14.463288 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-54mmr" podStartSLOduration=39.296347799 podStartE2EDuration="1m12.46325719s" podCreationTimestamp="2026-03-20 11:12:02 +0000 UTC" firstStartedPulling="2026-03-20 11:12:40.246087385 +0000 UTC m=+1138.026692948" lastFinishedPulling="2026-03-20 11:13:13.412996776 +0000 UTC m=+1171.193602339" observedRunningTime="2026-03-20 11:13:14.460974862 +0000 UTC m=+1172.241580435" watchObservedRunningTime="2026-03-20 11:13:14.46325719 +0000 UTC m=+1172.243862753" Mar 20 11:13:14 crc kubenswrapper[4695]: I0320 11:13:14.499792 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-njjcr" Mar 20 11:13:14 crc kubenswrapper[4695]: I0320 11:13:14.514326 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-hvnnc" Mar 20 11:13:24 crc kubenswrapper[4695]: I0320 11:13:24.675443 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-54mmr" Mar 20 11:13:38 crc kubenswrapper[4695]: I0320 11:13:38.430723 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:13:38 crc kubenswrapper[4695]: I0320 11:13:38.431630 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:13:44 crc kubenswrapper[4695]: I0320 11:13:44.491905 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-lc6p9"] Mar 20 11:13:44 crc kubenswrapper[4695]: E0320 11:13:44.500349 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df3aebe2-0698-4648-ac7c-eae261c6f8c1" containerName="oc" Mar 20 11:13:44 crc kubenswrapper[4695]: I0320 11:13:44.500399 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="df3aebe2-0698-4648-ac7c-eae261c6f8c1" containerName="oc" Mar 20 11:13:44 crc kubenswrapper[4695]: I0320 11:13:44.500549 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="df3aebe2-0698-4648-ac7c-eae261c6f8c1" containerName="oc" Mar 20 11:13:44 crc kubenswrapper[4695]: I0320 11:13:44.501278 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-lc6p9" Mar 20 11:13:44 crc kubenswrapper[4695]: I0320 11:13:44.506593 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-62p9w" Mar 20 11:13:44 crc kubenswrapper[4695]: I0320 11:13:44.506627 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 20 11:13:44 crc kubenswrapper[4695]: I0320 11:13:44.506825 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 20 11:13:44 crc kubenswrapper[4695]: I0320 11:13:44.517699 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 20 11:13:44 crc kubenswrapper[4695]: I0320 11:13:44.527312 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-lc6p9"] Mar 20 11:13:44 crc kubenswrapper[4695]: I0320 11:13:44.587049 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-nvpmk"] Mar 20 11:13:44 crc kubenswrapper[4695]: I0320 11:13:44.592830 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6a9b318-28de-4418-8cbe-fd01cc2aba1c-config\") pod \"dnsmasq-dns-675f4bcbfc-lc6p9\" (UID: \"d6a9b318-28de-4418-8cbe-fd01cc2aba1c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-lc6p9" Mar 20 11:13:44 crc kubenswrapper[4695]: I0320 11:13:44.592958 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76ffk\" (UniqueName: \"kubernetes.io/projected/d6a9b318-28de-4418-8cbe-fd01cc2aba1c-kube-api-access-76ffk\") pod \"dnsmasq-dns-675f4bcbfc-lc6p9\" (UID: \"d6a9b318-28de-4418-8cbe-fd01cc2aba1c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-lc6p9" Mar 20 11:13:44 crc kubenswrapper[4695]: I0320 11:13:44.594579 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-nvpmk" Mar 20 11:13:44 crc kubenswrapper[4695]: I0320 11:13:44.597346 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 20 11:13:44 crc kubenswrapper[4695]: I0320 11:13:44.613001 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-nvpmk"] Mar 20 11:13:44 crc kubenswrapper[4695]: I0320 11:13:44.694232 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76ffk\" (UniqueName: \"kubernetes.io/projected/d6a9b318-28de-4418-8cbe-fd01cc2aba1c-kube-api-access-76ffk\") pod \"dnsmasq-dns-675f4bcbfc-lc6p9\" (UID: \"d6a9b318-28de-4418-8cbe-fd01cc2aba1c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-lc6p9" Mar 20 11:13:44 crc kubenswrapper[4695]: I0320 11:13:44.694292 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0907202-d2f3-4249-805b-1fcb750f56af-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-nvpmk\" (UID: \"c0907202-d2f3-4249-805b-1fcb750f56af\") " pod="openstack/dnsmasq-dns-78dd6ddcc-nvpmk" Mar 20 11:13:44 crc kubenswrapper[4695]: I0320 11:13:44.694355 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfvlq\" (UniqueName: \"kubernetes.io/projected/c0907202-d2f3-4249-805b-1fcb750f56af-kube-api-access-xfvlq\") pod \"dnsmasq-dns-78dd6ddcc-nvpmk\" (UID: \"c0907202-d2f3-4249-805b-1fcb750f56af\") " pod="openstack/dnsmasq-dns-78dd6ddcc-nvpmk" Mar 20 11:13:44 crc kubenswrapper[4695]: I0320 11:13:44.694426 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0907202-d2f3-4249-805b-1fcb750f56af-config\") pod \"dnsmasq-dns-78dd6ddcc-nvpmk\" (UID: \"c0907202-d2f3-4249-805b-1fcb750f56af\") " pod="openstack/dnsmasq-dns-78dd6ddcc-nvpmk" Mar 20 
11:13:44 crc kubenswrapper[4695]: I0320 11:13:44.694491 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6a9b318-28de-4418-8cbe-fd01cc2aba1c-config\") pod \"dnsmasq-dns-675f4bcbfc-lc6p9\" (UID: \"d6a9b318-28de-4418-8cbe-fd01cc2aba1c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-lc6p9" Mar 20 11:13:44 crc kubenswrapper[4695]: I0320 11:13:44.695935 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6a9b318-28de-4418-8cbe-fd01cc2aba1c-config\") pod \"dnsmasq-dns-675f4bcbfc-lc6p9\" (UID: \"d6a9b318-28de-4418-8cbe-fd01cc2aba1c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-lc6p9" Mar 20 11:13:44 crc kubenswrapper[4695]: I0320 11:13:44.726365 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76ffk\" (UniqueName: \"kubernetes.io/projected/d6a9b318-28de-4418-8cbe-fd01cc2aba1c-kube-api-access-76ffk\") pod \"dnsmasq-dns-675f4bcbfc-lc6p9\" (UID: \"d6a9b318-28de-4418-8cbe-fd01cc2aba1c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-lc6p9" Mar 20 11:13:44 crc kubenswrapper[4695]: I0320 11:13:44.796420 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfvlq\" (UniqueName: \"kubernetes.io/projected/c0907202-d2f3-4249-805b-1fcb750f56af-kube-api-access-xfvlq\") pod \"dnsmasq-dns-78dd6ddcc-nvpmk\" (UID: \"c0907202-d2f3-4249-805b-1fcb750f56af\") " pod="openstack/dnsmasq-dns-78dd6ddcc-nvpmk" Mar 20 11:13:44 crc kubenswrapper[4695]: I0320 11:13:44.796504 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0907202-d2f3-4249-805b-1fcb750f56af-config\") pod \"dnsmasq-dns-78dd6ddcc-nvpmk\" (UID: \"c0907202-d2f3-4249-805b-1fcb750f56af\") " pod="openstack/dnsmasq-dns-78dd6ddcc-nvpmk" Mar 20 11:13:44 crc kubenswrapper[4695]: I0320 11:13:44.796570 4695 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0907202-d2f3-4249-805b-1fcb750f56af-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-nvpmk\" (UID: \"c0907202-d2f3-4249-805b-1fcb750f56af\") " pod="openstack/dnsmasq-dns-78dd6ddcc-nvpmk" Mar 20 11:13:44 crc kubenswrapper[4695]: I0320 11:13:44.797447 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0907202-d2f3-4249-805b-1fcb750f56af-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-nvpmk\" (UID: \"c0907202-d2f3-4249-805b-1fcb750f56af\") " pod="openstack/dnsmasq-dns-78dd6ddcc-nvpmk" Mar 20 11:13:44 crc kubenswrapper[4695]: I0320 11:13:44.797745 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0907202-d2f3-4249-805b-1fcb750f56af-config\") pod \"dnsmasq-dns-78dd6ddcc-nvpmk\" (UID: \"c0907202-d2f3-4249-805b-1fcb750f56af\") " pod="openstack/dnsmasq-dns-78dd6ddcc-nvpmk" Mar 20 11:13:44 crc kubenswrapper[4695]: I0320 11:13:44.816394 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfvlq\" (UniqueName: \"kubernetes.io/projected/c0907202-d2f3-4249-805b-1fcb750f56af-kube-api-access-xfvlq\") pod \"dnsmasq-dns-78dd6ddcc-nvpmk\" (UID: \"c0907202-d2f3-4249-805b-1fcb750f56af\") " pod="openstack/dnsmasq-dns-78dd6ddcc-nvpmk" Mar 20 11:13:44 crc kubenswrapper[4695]: I0320 11:13:44.864013 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-lc6p9" Mar 20 11:13:44 crc kubenswrapper[4695]: I0320 11:13:44.927506 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-nvpmk" Mar 20 11:13:45 crc kubenswrapper[4695]: I0320 11:13:45.450106 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-lc6p9"] Mar 20 11:13:45 crc kubenswrapper[4695]: I0320 11:13:45.528132 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-nvpmk"] Mar 20 11:13:45 crc kubenswrapper[4695]: W0320 11:13:45.532029 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0907202_d2f3_4249_805b_1fcb750f56af.slice/crio-f99d609e31f80ee2d42af8a6da6d18238742b6dfc15a30593d34304ea2b5f510 WatchSource:0}: Error finding container f99d609e31f80ee2d42af8a6da6d18238742b6dfc15a30593d34304ea2b5f510: Status 404 returned error can't find the container with id f99d609e31f80ee2d42af8a6da6d18238742b6dfc15a30593d34304ea2b5f510 Mar 20 11:13:46 crc kubenswrapper[4695]: I0320 11:13:46.133961 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-lc6p9" event={"ID":"d6a9b318-28de-4418-8cbe-fd01cc2aba1c","Type":"ContainerStarted","Data":"c74b8414df319dd82da0a63850b5acad6dc233160a0b0b09abc4da6cc657b881"} Mar 20 11:13:46 crc kubenswrapper[4695]: I0320 11:13:46.144426 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-nvpmk" event={"ID":"c0907202-d2f3-4249-805b-1fcb750f56af","Type":"ContainerStarted","Data":"f99d609e31f80ee2d42af8a6da6d18238742b6dfc15a30593d34304ea2b5f510"} Mar 20 11:14:00 crc kubenswrapper[4695]: I0320 11:14:00.486546 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566754-q2pt7"] Mar 20 11:14:00 crc kubenswrapper[4695]: I0320 11:14:00.488723 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566754-q2pt7" Mar 20 11:14:00 crc kubenswrapper[4695]: I0320 11:14:00.491491 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5kqds" Mar 20 11:14:00 crc kubenswrapper[4695]: I0320 11:14:00.495764 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566754-q2pt7"] Mar 20 11:14:00 crc kubenswrapper[4695]: I0320 11:14:00.495990 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:14:00 crc kubenswrapper[4695]: I0320 11:14:00.496544 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:14:00 crc kubenswrapper[4695]: I0320 11:14:00.580826 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h6vt\" (UniqueName: \"kubernetes.io/projected/decc7126-f843-4c09-aead-83b7584f1a0f-kube-api-access-9h6vt\") pod \"auto-csr-approver-29566754-q2pt7\" (UID: \"decc7126-f843-4c09-aead-83b7584f1a0f\") " pod="openshift-infra/auto-csr-approver-29566754-q2pt7" Mar 20 11:14:00 crc kubenswrapper[4695]: I0320 11:14:00.681817 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9h6vt\" (UniqueName: \"kubernetes.io/projected/decc7126-f843-4c09-aead-83b7584f1a0f-kube-api-access-9h6vt\") pod \"auto-csr-approver-29566754-q2pt7\" (UID: \"decc7126-f843-4c09-aead-83b7584f1a0f\") " pod="openshift-infra/auto-csr-approver-29566754-q2pt7" Mar 20 11:14:00 crc kubenswrapper[4695]: I0320 11:14:00.704055 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h6vt\" (UniqueName: \"kubernetes.io/projected/decc7126-f843-4c09-aead-83b7584f1a0f-kube-api-access-9h6vt\") pod \"auto-csr-approver-29566754-q2pt7\" (UID: \"decc7126-f843-4c09-aead-83b7584f1a0f\") " 
pod="openshift-infra/auto-csr-approver-29566754-q2pt7" Mar 20 11:14:00 crc kubenswrapper[4695]: I0320 11:14:00.825617 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566754-q2pt7" Mar 20 11:14:08 crc kubenswrapper[4695]: I0320 11:14:08.440448 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:14:08 crc kubenswrapper[4695]: I0320 11:14:08.441241 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:14:08 crc kubenswrapper[4695]: I0320 11:14:08.441297 4695 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" Mar 20 11:14:08 crc kubenswrapper[4695]: I0320 11:14:08.441891 4695 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cceac66b33a60ba76fb29486822ed6970274dda5bcbe64eb92732cba195eadd4"} pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:14:08 crc kubenswrapper[4695]: I0320 11:14:08.441966 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" 
containerID="cri-o://cceac66b33a60ba76fb29486822ed6970274dda5bcbe64eb92732cba195eadd4" gracePeriod=600 Mar 20 11:14:11 crc kubenswrapper[4695]: I0320 11:14:11.172225 4695 generic.go:334] "Generic (PLEG): container finished" podID="7859c924-84d7-4855-901e-c77a02c56e3a" containerID="cceac66b33a60ba76fb29486822ed6970274dda5bcbe64eb92732cba195eadd4" exitCode=0 Mar 20 11:14:11 crc kubenswrapper[4695]: I0320 11:14:11.174040 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" event={"ID":"7859c924-84d7-4855-901e-c77a02c56e3a","Type":"ContainerDied","Data":"cceac66b33a60ba76fb29486822ed6970274dda5bcbe64eb92732cba195eadd4"} Mar 20 11:14:11 crc kubenswrapper[4695]: I0320 11:14:11.174157 4695 scope.go:117] "RemoveContainer" containerID="b7f145b88a381dab0a3af9969335c65e981dd0f3a0b106c999ecbed0c035eef5" Mar 20 11:14:12 crc kubenswrapper[4695]: E0320 11:14:12.944951 4695 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 11:14:12 crc kubenswrapper[4695]: E0320 11:14:12.945781 4695 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xfvlq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-nvpmk_openstack(c0907202-d2f3-4249-805b-1fcb750f56af): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:14:12 crc kubenswrapper[4695]: E0320 11:14:12.947015 4695 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-nvpmk" podUID="c0907202-d2f3-4249-805b-1fcb750f56af" Mar 20 11:14:13 crc kubenswrapper[4695]: I0320 11:14:13.094622 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566754-q2pt7"] Mar 20 11:14:13 crc kubenswrapper[4695]: W0320 11:14:13.097097 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddecc7126_f843_4c09_aead_83b7584f1a0f.slice/crio-9cb7326d055ca8a0fb0c715deeb6b17188df77d1799d3204a8345407afbbd6ab WatchSource:0}: Error finding container 9cb7326d055ca8a0fb0c715deeb6b17188df77d1799d3204a8345407afbbd6ab: Status 404 returned error can't find the container with id 9cb7326d055ca8a0fb0c715deeb6b17188df77d1799d3204a8345407afbbd6ab Mar 20 11:14:13 crc kubenswrapper[4695]: I0320 11:14:13.100208 4695 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:14:13 crc kubenswrapper[4695]: I0320 11:14:13.621263 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566754-q2pt7" event={"ID":"decc7126-f843-4c09-aead-83b7584f1a0f","Type":"ContainerStarted","Data":"9cb7326d055ca8a0fb0c715deeb6b17188df77d1799d3204a8345407afbbd6ab"} Mar 20 11:14:13 crc kubenswrapper[4695]: I0320 11:14:13.625047 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" event={"ID":"7859c924-84d7-4855-901e-c77a02c56e3a","Type":"ContainerStarted","Data":"8bc9cc8c349e0d536cb6c63da318c52d0806648b595c149581e7bb85b97cf1a4"} Mar 20 11:14:13 crc kubenswrapper[4695]: E0320 11:14:13.626220 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off 
pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-78dd6ddcc-nvpmk" podUID="c0907202-d2f3-4249-805b-1fcb750f56af" Mar 20 11:14:13 crc kubenswrapper[4695]: E0320 11:14:13.642692 4695 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 11:14:13 crc kubenswrapper[4695]: E0320 11:14:13.642886 4695 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-76ffk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Cap
abilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-lc6p9_openstack(d6a9b318-28de-4418-8cbe-fd01cc2aba1c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:14:13 crc kubenswrapper[4695]: E0320 11:14:13.644549 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-lc6p9" podUID="d6a9b318-28de-4418-8cbe-fd01cc2aba1c" Mar 20 11:14:14 crc kubenswrapper[4695]: E0320 11:14:14.632373 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-675f4bcbfc-lc6p9" podUID="d6a9b318-28de-4418-8cbe-fd01cc2aba1c" Mar 20 11:14:17 crc kubenswrapper[4695]: I0320 11:14:17.704623 4695 generic.go:334] "Generic (PLEG): container finished" podID="decc7126-f843-4c09-aead-83b7584f1a0f" containerID="2bf15c95d48ed2fb30c6ae5cd2bad947d1425285e80ae9ee1d5ea3818a59455b" exitCode=0 Mar 20 11:14:17 crc kubenswrapper[4695]: I0320 11:14:17.704679 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566754-q2pt7" 
event={"ID":"decc7126-f843-4c09-aead-83b7584f1a0f","Type":"ContainerDied","Data":"2bf15c95d48ed2fb30c6ae5cd2bad947d1425285e80ae9ee1d5ea3818a59455b"} Mar 20 11:14:21 crc kubenswrapper[4695]: I0320 11:14:21.359033 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566754-q2pt7" Mar 20 11:14:22 crc kubenswrapper[4695]: I0320 11:14:22.788415 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566754-q2pt7" event={"ID":"decc7126-f843-4c09-aead-83b7584f1a0f","Type":"ContainerDied","Data":"9cb7326d055ca8a0fb0c715deeb6b17188df77d1799d3204a8345407afbbd6ab"} Mar 20 11:14:22 crc kubenswrapper[4695]: I0320 11:14:22.788802 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cb7326d055ca8a0fb0c715deeb6b17188df77d1799d3204a8345407afbbd6ab" Mar 20 11:14:22 crc kubenswrapper[4695]: I0320 11:14:22.788856 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566754-q2pt7" Mar 20 11:14:22 crc kubenswrapper[4695]: I0320 11:14:22.799890 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9h6vt\" (UniqueName: \"kubernetes.io/projected/decc7126-f843-4c09-aead-83b7584f1a0f-kube-api-access-9h6vt\") pod \"decc7126-f843-4c09-aead-83b7584f1a0f\" (UID: \"decc7126-f843-4c09-aead-83b7584f1a0f\") " Mar 20 11:14:22 crc kubenswrapper[4695]: I0320 11:14:22.807461 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/decc7126-f843-4c09-aead-83b7584f1a0f-kube-api-access-9h6vt" (OuterVolumeSpecName: "kube-api-access-9h6vt") pod "decc7126-f843-4c09-aead-83b7584f1a0f" (UID: "decc7126-f843-4c09-aead-83b7584f1a0f"). InnerVolumeSpecName "kube-api-access-9h6vt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:14:22 crc kubenswrapper[4695]: I0320 11:14:22.903937 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9h6vt\" (UniqueName: \"kubernetes.io/projected/decc7126-f843-4c09-aead-83b7584f1a0f-kube-api-access-9h6vt\") on node \"crc\" DevicePath \"\"" Mar 20 11:14:23 crc kubenswrapper[4695]: I0320 11:14:23.924244 4695 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-g8s68" podUID="a6a059b0-61e0-4592-8661-e480f9573c66" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.61:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 11:14:24 crc kubenswrapper[4695]: I0320 11:14:24.008181 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566748-jlvx2"] Mar 20 11:14:24 crc kubenswrapper[4695]: I0320 11:14:24.016201 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566748-jlvx2"] Mar 20 11:14:24 crc kubenswrapper[4695]: I0320 11:14:24.901653 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f15d73b7-72cb-4cee-8019-e27bf7c0e646" path="/var/lib/kubelet/pods/f15d73b7-72cb-4cee-8019-e27bf7c0e646/volumes" Mar 20 11:14:25 crc kubenswrapper[4695]: I0320 11:14:25.963805 4695 generic.go:334] "Generic (PLEG): container finished" podID="c0907202-d2f3-4249-805b-1fcb750f56af" containerID="5a6377d04564441aef9a763d0b0870426bfac40cc40d3114d488c69ea43e8336" exitCode=0 Mar 20 11:14:25 crc kubenswrapper[4695]: I0320 11:14:25.963891 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-nvpmk" event={"ID":"c0907202-d2f3-4249-805b-1fcb750f56af","Type":"ContainerDied","Data":"5a6377d04564441aef9a763d0b0870426bfac40cc40d3114d488c69ea43e8336"} Mar 20 11:14:26 crc kubenswrapper[4695]: I0320 11:14:26.973136 4695 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-nvpmk" event={"ID":"c0907202-d2f3-4249-805b-1fcb750f56af","Type":"ContainerStarted","Data":"f8015cc8d81f1c74a5e554cad94a440a9c6cce8499c575d70263ec8c6a4f6a7e"} Mar 20 11:14:26 crc kubenswrapper[4695]: I0320 11:14:26.973537 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78dd6ddcc-nvpmk" Mar 20 11:14:27 crc kubenswrapper[4695]: I0320 11:14:27.008808 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78dd6ddcc-nvpmk" podStartSLOduration=3.122648678 podStartE2EDuration="43.008769978s" podCreationTimestamp="2026-03-20 11:13:44 +0000 UTC" firstStartedPulling="2026-03-20 11:13:45.534403365 +0000 UTC m=+1203.315008918" lastFinishedPulling="2026-03-20 11:14:25.420524645 +0000 UTC m=+1243.201130218" observedRunningTime="2026-03-20 11:14:26.998183149 +0000 UTC m=+1244.778788712" watchObservedRunningTime="2026-03-20 11:14:27.008769978 +0000 UTC m=+1244.789375541" Mar 20 11:14:31 crc kubenswrapper[4695]: I0320 11:14:31.552005 4695 generic.go:334] "Generic (PLEG): container finished" podID="d6a9b318-28de-4418-8cbe-fd01cc2aba1c" containerID="3862db532df280bbfb18117bc773282f4790d9b45e1b655af1e56e2ed8aa18ad" exitCode=0 Mar 20 11:14:31 crc kubenswrapper[4695]: I0320 11:14:31.552090 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-lc6p9" event={"ID":"d6a9b318-28de-4418-8cbe-fd01cc2aba1c","Type":"ContainerDied","Data":"3862db532df280bbfb18117bc773282f4790d9b45e1b655af1e56e2ed8aa18ad"} Mar 20 11:14:32 crc kubenswrapper[4695]: I0320 11:14:32.569804 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-lc6p9" event={"ID":"d6a9b318-28de-4418-8cbe-fd01cc2aba1c","Type":"ContainerStarted","Data":"cd1658b99518a4046f69e49c0722073807e77907c8da2b9a8148d9d500ddd3cd"} Mar 20 11:14:32 crc kubenswrapper[4695]: I0320 11:14:32.572076 4695 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-675f4bcbfc-lc6p9" Mar 20 11:14:34 crc kubenswrapper[4695]: I0320 11:14:34.929089 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78dd6ddcc-nvpmk" Mar 20 11:14:34 crc kubenswrapper[4695]: I0320 11:14:34.953604 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-675f4bcbfc-lc6p9" podStartSLOduration=-9223371985.901194 podStartE2EDuration="50.953582172s" podCreationTimestamp="2026-03-20 11:13:44 +0000 UTC" firstStartedPulling="2026-03-20 11:13:45.463260528 +0000 UTC m=+1203.243866081" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:14:32.59289576 +0000 UTC m=+1250.373501353" watchObservedRunningTime="2026-03-20 11:14:34.953582172 +0000 UTC m=+1252.734187735" Mar 20 11:14:34 crc kubenswrapper[4695]: I0320 11:14:34.987862 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-lc6p9"] Mar 20 11:14:34 crc kubenswrapper[4695]: I0320 11:14:34.988150 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-675f4bcbfc-lc6p9" podUID="d6a9b318-28de-4418-8cbe-fd01cc2aba1c" containerName="dnsmasq-dns" containerID="cri-o://cd1658b99518a4046f69e49c0722073807e77907c8da2b9a8148d9d500ddd3cd" gracePeriod=10 Mar 20 11:14:35 crc kubenswrapper[4695]: I0320 11:14:35.692892 4695 generic.go:334] "Generic (PLEG): container finished" podID="d6a9b318-28de-4418-8cbe-fd01cc2aba1c" containerID="cd1658b99518a4046f69e49c0722073807e77907c8da2b9a8148d9d500ddd3cd" exitCode=0 Mar 20 11:14:35 crc kubenswrapper[4695]: I0320 11:14:35.693062 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-lc6p9" event={"ID":"d6a9b318-28de-4418-8cbe-fd01cc2aba1c","Type":"ContainerDied","Data":"cd1658b99518a4046f69e49c0722073807e77907c8da2b9a8148d9d500ddd3cd"} Mar 20 
11:14:36 crc kubenswrapper[4695]: I0320 11:14:36.033476 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-lc6p9" Mar 20 11:14:36 crc kubenswrapper[4695]: I0320 11:14:36.207336 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6a9b318-28de-4418-8cbe-fd01cc2aba1c-config\") pod \"d6a9b318-28de-4418-8cbe-fd01cc2aba1c\" (UID: \"d6a9b318-28de-4418-8cbe-fd01cc2aba1c\") " Mar 20 11:14:36 crc kubenswrapper[4695]: I0320 11:14:36.207803 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76ffk\" (UniqueName: \"kubernetes.io/projected/d6a9b318-28de-4418-8cbe-fd01cc2aba1c-kube-api-access-76ffk\") pod \"d6a9b318-28de-4418-8cbe-fd01cc2aba1c\" (UID: \"d6a9b318-28de-4418-8cbe-fd01cc2aba1c\") " Mar 20 11:14:36 crc kubenswrapper[4695]: I0320 11:14:36.215756 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6a9b318-28de-4418-8cbe-fd01cc2aba1c-kube-api-access-76ffk" (OuterVolumeSpecName: "kube-api-access-76ffk") pod "d6a9b318-28de-4418-8cbe-fd01cc2aba1c" (UID: "d6a9b318-28de-4418-8cbe-fd01cc2aba1c"). InnerVolumeSpecName "kube-api-access-76ffk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:14:36 crc kubenswrapper[4695]: I0320 11:14:36.309349 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76ffk\" (UniqueName: \"kubernetes.io/projected/d6a9b318-28de-4418-8cbe-fd01cc2aba1c-kube-api-access-76ffk\") on node \"crc\" DevicePath \"\"" Mar 20 11:14:36 crc kubenswrapper[4695]: I0320 11:14:36.638215 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6a9b318-28de-4418-8cbe-fd01cc2aba1c-config" (OuterVolumeSpecName: "config") pod "d6a9b318-28de-4418-8cbe-fd01cc2aba1c" (UID: "d6a9b318-28de-4418-8cbe-fd01cc2aba1c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:14:36 crc kubenswrapper[4695]: I0320 11:14:36.677322 4695 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6a9b318-28de-4418-8cbe-fd01cc2aba1c-config\") on node \"crc\" DevicePath \"\"" Mar 20 11:14:36 crc kubenswrapper[4695]: I0320 11:14:36.708219 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-lc6p9" event={"ID":"d6a9b318-28de-4418-8cbe-fd01cc2aba1c","Type":"ContainerDied","Data":"c74b8414df319dd82da0a63850b5acad6dc233160a0b0b09abc4da6cc657b881"} Mar 20 11:14:36 crc kubenswrapper[4695]: I0320 11:14:36.708296 4695 scope.go:117] "RemoveContainer" containerID="cd1658b99518a4046f69e49c0722073807e77907c8da2b9a8148d9d500ddd3cd" Mar 20 11:14:36 crc kubenswrapper[4695]: I0320 11:14:36.708520 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-lc6p9" Mar 20 11:14:36 crc kubenswrapper[4695]: I0320 11:14:36.749459 4695 scope.go:117] "RemoveContainer" containerID="3862db532df280bbfb18117bc773282f4790d9b45e1b655af1e56e2ed8aa18ad" Mar 20 11:14:36 crc kubenswrapper[4695]: I0320 11:14:36.768026 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-lc6p9"] Mar 20 11:14:36 crc kubenswrapper[4695]: I0320 11:14:36.778424 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-lc6p9"] Mar 20 11:14:36 crc kubenswrapper[4695]: I0320 11:14:36.900880 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6a9b318-28de-4418-8cbe-fd01cc2aba1c" path="/var/lib/kubelet/pods/d6a9b318-28de-4418-8cbe-fd01cc2aba1c/volumes" Mar 20 11:15:00 crc kubenswrapper[4695]: I0320 11:15:00.159200 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566755-7db6h"] Mar 20 11:15:00 crc kubenswrapper[4695]: E0320 11:15:00.162851 
4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6a9b318-28de-4418-8cbe-fd01cc2aba1c" containerName="dnsmasq-dns" Mar 20 11:15:00 crc kubenswrapper[4695]: I0320 11:15:00.162896 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6a9b318-28de-4418-8cbe-fd01cc2aba1c" containerName="dnsmasq-dns" Mar 20 11:15:00 crc kubenswrapper[4695]: E0320 11:15:00.162946 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6a9b318-28de-4418-8cbe-fd01cc2aba1c" containerName="init" Mar 20 11:15:00 crc kubenswrapper[4695]: I0320 11:15:00.162955 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6a9b318-28de-4418-8cbe-fd01cc2aba1c" containerName="init" Mar 20 11:15:00 crc kubenswrapper[4695]: E0320 11:15:00.162981 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="decc7126-f843-4c09-aead-83b7584f1a0f" containerName="oc" Mar 20 11:15:00 crc kubenswrapper[4695]: I0320 11:15:00.162990 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="decc7126-f843-4c09-aead-83b7584f1a0f" containerName="oc" Mar 20 11:15:00 crc kubenswrapper[4695]: I0320 11:15:00.163257 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="decc7126-f843-4c09-aead-83b7584f1a0f" containerName="oc" Mar 20 11:15:00 crc kubenswrapper[4695]: I0320 11:15:00.163291 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6a9b318-28de-4418-8cbe-fd01cc2aba1c" containerName="dnsmasq-dns" Mar 20 11:15:00 crc kubenswrapper[4695]: I0320 11:15:00.164427 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-7db6h" Mar 20 11:15:00 crc kubenswrapper[4695]: I0320 11:15:00.168067 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 11:15:00 crc kubenswrapper[4695]: I0320 11:15:00.168174 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 11:15:00 crc kubenswrapper[4695]: I0320 11:15:00.169032 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566755-7db6h"] Mar 20 11:15:00 crc kubenswrapper[4695]: I0320 11:15:00.251560 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhgzg\" (UniqueName: \"kubernetes.io/projected/54bdc330-f0f8-4bd3-b0bb-bd3a4056c001-kube-api-access-xhgzg\") pod \"collect-profiles-29566755-7db6h\" (UID: \"54bdc330-f0f8-4bd3-b0bb-bd3a4056c001\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-7db6h" Mar 20 11:15:00 crc kubenswrapper[4695]: I0320 11:15:00.251694 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54bdc330-f0f8-4bd3-b0bb-bd3a4056c001-config-volume\") pod \"collect-profiles-29566755-7db6h\" (UID: \"54bdc330-f0f8-4bd3-b0bb-bd3a4056c001\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-7db6h" Mar 20 11:15:00 crc kubenswrapper[4695]: I0320 11:15:00.251740 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54bdc330-f0f8-4bd3-b0bb-bd3a4056c001-secret-volume\") pod \"collect-profiles-29566755-7db6h\" (UID: \"54bdc330-f0f8-4bd3-b0bb-bd3a4056c001\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-7db6h" Mar 20 11:15:00 crc kubenswrapper[4695]: I0320 11:15:00.353483 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54bdc330-f0f8-4bd3-b0bb-bd3a4056c001-secret-volume\") pod \"collect-profiles-29566755-7db6h\" (UID: \"54bdc330-f0f8-4bd3-b0bb-bd3a4056c001\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-7db6h" Mar 20 11:15:00 crc kubenswrapper[4695]: I0320 11:15:00.353572 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhgzg\" (UniqueName: \"kubernetes.io/projected/54bdc330-f0f8-4bd3-b0bb-bd3a4056c001-kube-api-access-xhgzg\") pod \"collect-profiles-29566755-7db6h\" (UID: \"54bdc330-f0f8-4bd3-b0bb-bd3a4056c001\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-7db6h" Mar 20 11:15:00 crc kubenswrapper[4695]: I0320 11:15:00.353685 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54bdc330-f0f8-4bd3-b0bb-bd3a4056c001-config-volume\") pod \"collect-profiles-29566755-7db6h\" (UID: \"54bdc330-f0f8-4bd3-b0bb-bd3a4056c001\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-7db6h" Mar 20 11:15:00 crc kubenswrapper[4695]: I0320 11:15:00.354730 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54bdc330-f0f8-4bd3-b0bb-bd3a4056c001-config-volume\") pod \"collect-profiles-29566755-7db6h\" (UID: \"54bdc330-f0f8-4bd3-b0bb-bd3a4056c001\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-7db6h" Mar 20 11:15:00 crc kubenswrapper[4695]: I0320 11:15:00.362158 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/54bdc330-f0f8-4bd3-b0bb-bd3a4056c001-secret-volume\") pod \"collect-profiles-29566755-7db6h\" (UID: \"54bdc330-f0f8-4bd3-b0bb-bd3a4056c001\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-7db6h" Mar 20 11:15:00 crc kubenswrapper[4695]: I0320 11:15:00.372875 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhgzg\" (UniqueName: \"kubernetes.io/projected/54bdc330-f0f8-4bd3-b0bb-bd3a4056c001-kube-api-access-xhgzg\") pod \"collect-profiles-29566755-7db6h\" (UID: \"54bdc330-f0f8-4bd3-b0bb-bd3a4056c001\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-7db6h" Mar 20 11:15:00 crc kubenswrapper[4695]: I0320 11:15:00.487367 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-7db6h" Mar 20 11:15:00 crc kubenswrapper[4695]: I0320 11:15:00.962020 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566755-7db6h"] Mar 20 11:15:00 crc kubenswrapper[4695]: I0320 11:15:00.976859 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-7db6h" event={"ID":"54bdc330-f0f8-4bd3-b0bb-bd3a4056c001","Type":"ContainerStarted","Data":"3fc7f112678759d6c4220c6a5b1f7fe6fedb84bcce719911a34d6d739a571e27"} Mar 20 11:15:01 crc kubenswrapper[4695]: I0320 11:15:01.988454 4695 generic.go:334] "Generic (PLEG): container finished" podID="54bdc330-f0f8-4bd3-b0bb-bd3a4056c001" containerID="04f7427c288977e9a8d216e2e64cb2f793c94eace9d8655c31e60cc40796c9b8" exitCode=0 Mar 20 11:15:01 crc kubenswrapper[4695]: I0320 11:15:01.988576 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-7db6h" 
event={"ID":"54bdc330-f0f8-4bd3-b0bb-bd3a4056c001","Type":"ContainerDied","Data":"04f7427c288977e9a8d216e2e64cb2f793c94eace9d8655c31e60cc40796c9b8"} Mar 20 11:15:03 crc kubenswrapper[4695]: I0320 11:15:03.247626 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-7db6h" Mar 20 11:15:03 crc kubenswrapper[4695]: I0320 11:15:03.307567 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54bdc330-f0f8-4bd3-b0bb-bd3a4056c001-config-volume\") pod \"54bdc330-f0f8-4bd3-b0bb-bd3a4056c001\" (UID: \"54bdc330-f0f8-4bd3-b0bb-bd3a4056c001\") " Mar 20 11:15:03 crc kubenswrapper[4695]: I0320 11:15:03.307644 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhgzg\" (UniqueName: \"kubernetes.io/projected/54bdc330-f0f8-4bd3-b0bb-bd3a4056c001-kube-api-access-xhgzg\") pod \"54bdc330-f0f8-4bd3-b0bb-bd3a4056c001\" (UID: \"54bdc330-f0f8-4bd3-b0bb-bd3a4056c001\") " Mar 20 11:15:03 crc kubenswrapper[4695]: I0320 11:15:03.307921 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54bdc330-f0f8-4bd3-b0bb-bd3a4056c001-secret-volume\") pod \"54bdc330-f0f8-4bd3-b0bb-bd3a4056c001\" (UID: \"54bdc330-f0f8-4bd3-b0bb-bd3a4056c001\") " Mar 20 11:15:03 crc kubenswrapper[4695]: I0320 11:15:03.308794 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54bdc330-f0f8-4bd3-b0bb-bd3a4056c001-config-volume" (OuterVolumeSpecName: "config-volume") pod "54bdc330-f0f8-4bd3-b0bb-bd3a4056c001" (UID: "54bdc330-f0f8-4bd3-b0bb-bd3a4056c001"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:15:03 crc kubenswrapper[4695]: I0320 11:15:03.313770 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54bdc330-f0f8-4bd3-b0bb-bd3a4056c001-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "54bdc330-f0f8-4bd3-b0bb-bd3a4056c001" (UID: "54bdc330-f0f8-4bd3-b0bb-bd3a4056c001"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:15:03 crc kubenswrapper[4695]: I0320 11:15:03.313823 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54bdc330-f0f8-4bd3-b0bb-bd3a4056c001-kube-api-access-xhgzg" (OuterVolumeSpecName: "kube-api-access-xhgzg") pod "54bdc330-f0f8-4bd3-b0bb-bd3a4056c001" (UID: "54bdc330-f0f8-4bd3-b0bb-bd3a4056c001"). InnerVolumeSpecName "kube-api-access-xhgzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:15:03 crc kubenswrapper[4695]: I0320 11:15:03.410111 4695 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/54bdc330-f0f8-4bd3-b0bb-bd3a4056c001-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:15:03 crc kubenswrapper[4695]: I0320 11:15:03.410154 4695 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/54bdc330-f0f8-4bd3-b0bb-bd3a4056c001-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:15:03 crc kubenswrapper[4695]: I0320 11:15:03.410182 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhgzg\" (UniqueName: \"kubernetes.io/projected/54bdc330-f0f8-4bd3-b0bb-bd3a4056c001-kube-api-access-xhgzg\") on node \"crc\" DevicePath \"\"" Mar 20 11:15:04 crc kubenswrapper[4695]: I0320 11:15:04.007501 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-7db6h" 
event={"ID":"54bdc330-f0f8-4bd3-b0bb-bd3a4056c001","Type":"ContainerDied","Data":"3fc7f112678759d6c4220c6a5b1f7fe6fedb84bcce719911a34d6d739a571e27"} Mar 20 11:15:04 crc kubenswrapper[4695]: I0320 11:15:04.007552 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-7db6h" Mar 20 11:15:04 crc kubenswrapper[4695]: I0320 11:15:04.007552 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fc7f112678759d6c4220c6a5b1f7fe6fedb84bcce719911a34d6d739a571e27" Mar 20 11:15:05 crc kubenswrapper[4695]: I0320 11:15:05.678607 4695 scope.go:117] "RemoveContainer" containerID="8c51d9c542e5473472702678a9eb944d20800febece8e2918120e4a01e93b127" Mar 20 11:16:00 crc kubenswrapper[4695]: I0320 11:16:00.158457 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566756-l5sts"] Mar 20 11:16:00 crc kubenswrapper[4695]: E0320 11:16:00.160363 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54bdc330-f0f8-4bd3-b0bb-bd3a4056c001" containerName="collect-profiles" Mar 20 11:16:00 crc kubenswrapper[4695]: I0320 11:16:00.160382 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="54bdc330-f0f8-4bd3-b0bb-bd3a4056c001" containerName="collect-profiles" Mar 20 11:16:00 crc kubenswrapper[4695]: I0320 11:16:00.160534 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="54bdc330-f0f8-4bd3-b0bb-bd3a4056c001" containerName="collect-profiles" Mar 20 11:16:00 crc kubenswrapper[4695]: I0320 11:16:00.161115 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566756-l5sts" Mar 20 11:16:00 crc kubenswrapper[4695]: I0320 11:16:00.165428 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5kqds" Mar 20 11:16:00 crc kubenswrapper[4695]: I0320 11:16:00.169533 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:16:00 crc kubenswrapper[4695]: I0320 11:16:00.169827 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:16:00 crc kubenswrapper[4695]: I0320 11:16:00.170673 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566756-l5sts"] Mar 20 11:16:00 crc kubenswrapper[4695]: I0320 11:16:00.324960 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm5bx\" (UniqueName: \"kubernetes.io/projected/596046b2-828a-4fb3-b57f-db82066f115e-kube-api-access-tm5bx\") pod \"auto-csr-approver-29566756-l5sts\" (UID: \"596046b2-828a-4fb3-b57f-db82066f115e\") " pod="openshift-infra/auto-csr-approver-29566756-l5sts" Mar 20 11:16:00 crc kubenswrapper[4695]: I0320 11:16:00.427524 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm5bx\" (UniqueName: \"kubernetes.io/projected/596046b2-828a-4fb3-b57f-db82066f115e-kube-api-access-tm5bx\") pod \"auto-csr-approver-29566756-l5sts\" (UID: \"596046b2-828a-4fb3-b57f-db82066f115e\") " pod="openshift-infra/auto-csr-approver-29566756-l5sts" Mar 20 11:16:00 crc kubenswrapper[4695]: I0320 11:16:00.453328 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm5bx\" (UniqueName: \"kubernetes.io/projected/596046b2-828a-4fb3-b57f-db82066f115e-kube-api-access-tm5bx\") pod \"auto-csr-approver-29566756-l5sts\" (UID: \"596046b2-828a-4fb3-b57f-db82066f115e\") " 
pod="openshift-infra/auto-csr-approver-29566756-l5sts" Mar 20 11:16:00 crc kubenswrapper[4695]: I0320 11:16:00.493368 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566756-l5sts" Mar 20 11:16:00 crc kubenswrapper[4695]: I0320 11:16:00.954846 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566756-l5sts"] Mar 20 11:16:01 crc kubenswrapper[4695]: I0320 11:16:01.920215 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566756-l5sts" event={"ID":"596046b2-828a-4fb3-b57f-db82066f115e","Type":"ContainerStarted","Data":"b35cd5f5a7e683c65e0ce01022c33d229916f4a283029290f738d3f8f25287a6"} Mar 20 11:16:02 crc kubenswrapper[4695]: I0320 11:16:02.933157 4695 generic.go:334] "Generic (PLEG): container finished" podID="596046b2-828a-4fb3-b57f-db82066f115e" containerID="fbc5ef08a27a876cba3befb5d973447021cc6182e120c230541491c63c957df7" exitCode=0 Mar 20 11:16:02 crc kubenswrapper[4695]: I0320 11:16:02.933450 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566756-l5sts" event={"ID":"596046b2-828a-4fb3-b57f-db82066f115e","Type":"ContainerDied","Data":"fbc5ef08a27a876cba3befb5d973447021cc6182e120c230541491c63c957df7"} Mar 20 11:16:04 crc kubenswrapper[4695]: I0320 11:16:04.243413 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566756-l5sts" Mar 20 11:16:04 crc kubenswrapper[4695]: I0320 11:16:04.420152 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tm5bx\" (UniqueName: \"kubernetes.io/projected/596046b2-828a-4fb3-b57f-db82066f115e-kube-api-access-tm5bx\") pod \"596046b2-828a-4fb3-b57f-db82066f115e\" (UID: \"596046b2-828a-4fb3-b57f-db82066f115e\") " Mar 20 11:16:04 crc kubenswrapper[4695]: I0320 11:16:04.431037 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/596046b2-828a-4fb3-b57f-db82066f115e-kube-api-access-tm5bx" (OuterVolumeSpecName: "kube-api-access-tm5bx") pod "596046b2-828a-4fb3-b57f-db82066f115e" (UID: "596046b2-828a-4fb3-b57f-db82066f115e"). InnerVolumeSpecName "kube-api-access-tm5bx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:16:04 crc kubenswrapper[4695]: I0320 11:16:04.521980 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tm5bx\" (UniqueName: \"kubernetes.io/projected/596046b2-828a-4fb3-b57f-db82066f115e-kube-api-access-tm5bx\") on node \"crc\" DevicePath \"\"" Mar 20 11:16:04 crc kubenswrapper[4695]: I0320 11:16:04.954028 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566756-l5sts" event={"ID":"596046b2-828a-4fb3-b57f-db82066f115e","Type":"ContainerDied","Data":"b35cd5f5a7e683c65e0ce01022c33d229916f4a283029290f738d3f8f25287a6"} Mar 20 11:16:04 crc kubenswrapper[4695]: I0320 11:16:04.954375 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b35cd5f5a7e683c65e0ce01022c33d229916f4a283029290f738d3f8f25287a6" Mar 20 11:16:04 crc kubenswrapper[4695]: I0320 11:16:04.954113 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566756-l5sts" Mar 20 11:16:05 crc kubenswrapper[4695]: I0320 11:16:05.320978 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566750-8w627"] Mar 20 11:16:05 crc kubenswrapper[4695]: I0320 11:16:05.326533 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566750-8w627"] Mar 20 11:16:06 crc kubenswrapper[4695]: I0320 11:16:06.896744 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7bded3b-99e6-4985-8928-f58fc6203a14" path="/var/lib/kubelet/pods/b7bded3b-99e6-4985-8928-f58fc6203a14/volumes" Mar 20 11:16:38 crc kubenswrapper[4695]: I0320 11:16:38.431438 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:16:38 crc kubenswrapper[4695]: I0320 11:16:38.432302 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:17:05 crc kubenswrapper[4695]: I0320 11:17:05.775805 4695 scope.go:117] "RemoveContainer" containerID="86eca8eb6dc04e2d8ed1b50291cf7501083ecbb722c7cd38c692c89ce9518d61" Mar 20 11:17:08 crc kubenswrapper[4695]: I0320 11:17:08.432358 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:17:08 crc kubenswrapper[4695]: 
I0320 11:17:08.432758 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:17:38 crc kubenswrapper[4695]: I0320 11:17:38.430921 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:17:38 crc kubenswrapper[4695]: I0320 11:17:38.431483 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:17:38 crc kubenswrapper[4695]: I0320 11:17:38.431545 4695 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" Mar 20 11:17:38 crc kubenswrapper[4695]: I0320 11:17:38.432081 4695 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8bc9cc8c349e0d536cb6c63da318c52d0806648b595c149581e7bb85b97cf1a4"} pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:17:38 crc kubenswrapper[4695]: I0320 11:17:38.432150 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" 
containerName="machine-config-daemon" containerID="cri-o://8bc9cc8c349e0d536cb6c63da318c52d0806648b595c149581e7bb85b97cf1a4" gracePeriod=600 Mar 20 11:17:38 crc kubenswrapper[4695]: I0320 11:17:38.992820 4695 generic.go:334] "Generic (PLEG): container finished" podID="7859c924-84d7-4855-901e-c77a02c56e3a" containerID="8bc9cc8c349e0d536cb6c63da318c52d0806648b595c149581e7bb85b97cf1a4" exitCode=0 Mar 20 11:17:38 crc kubenswrapper[4695]: I0320 11:17:38.992918 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" event={"ID":"7859c924-84d7-4855-901e-c77a02c56e3a","Type":"ContainerDied","Data":"8bc9cc8c349e0d536cb6c63da318c52d0806648b595c149581e7bb85b97cf1a4"} Mar 20 11:17:38 crc kubenswrapper[4695]: I0320 11:17:38.993372 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" event={"ID":"7859c924-84d7-4855-901e-c77a02c56e3a","Type":"ContainerStarted","Data":"4f610c401878857d4cef8a9377c20e48073892cd3a403b8cebff93bc8913af7a"} Mar 20 11:17:38 crc kubenswrapper[4695]: I0320 11:17:38.993406 4695 scope.go:117] "RemoveContainer" containerID="cceac66b33a60ba76fb29486822ed6970274dda5bcbe64eb92732cba195eadd4" Mar 20 11:18:00 crc kubenswrapper[4695]: I0320 11:18:00.150992 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566758-wgd6c"] Mar 20 11:18:00 crc kubenswrapper[4695]: E0320 11:18:00.153763 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596046b2-828a-4fb3-b57f-db82066f115e" containerName="oc" Mar 20 11:18:00 crc kubenswrapper[4695]: I0320 11:18:00.153898 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="596046b2-828a-4fb3-b57f-db82066f115e" containerName="oc" Mar 20 11:18:00 crc kubenswrapper[4695]: I0320 11:18:00.154260 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="596046b2-828a-4fb3-b57f-db82066f115e" containerName="oc" Mar 20 11:18:00 
crc kubenswrapper[4695]: I0320 11:18:00.155123 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566758-wgd6c" Mar 20 11:18:00 crc kubenswrapper[4695]: I0320 11:18:00.157804 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5kqds" Mar 20 11:18:00 crc kubenswrapper[4695]: I0320 11:18:00.158286 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:18:00 crc kubenswrapper[4695]: I0320 11:18:00.159177 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566758-wgd6c"] Mar 20 11:18:00 crc kubenswrapper[4695]: I0320 11:18:00.161946 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:18:00 crc kubenswrapper[4695]: I0320 11:18:00.275241 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkjff\" (UniqueName: \"kubernetes.io/projected/5b2e67f3-2381-404e-95e8-3fdeb41852ee-kube-api-access-wkjff\") pod \"auto-csr-approver-29566758-wgd6c\" (UID: \"5b2e67f3-2381-404e-95e8-3fdeb41852ee\") " pod="openshift-infra/auto-csr-approver-29566758-wgd6c" Mar 20 11:18:00 crc kubenswrapper[4695]: I0320 11:18:00.377526 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkjff\" (UniqueName: \"kubernetes.io/projected/5b2e67f3-2381-404e-95e8-3fdeb41852ee-kube-api-access-wkjff\") pod \"auto-csr-approver-29566758-wgd6c\" (UID: \"5b2e67f3-2381-404e-95e8-3fdeb41852ee\") " pod="openshift-infra/auto-csr-approver-29566758-wgd6c" Mar 20 11:18:00 crc kubenswrapper[4695]: I0320 11:18:00.401459 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkjff\" (UniqueName: \"kubernetes.io/projected/5b2e67f3-2381-404e-95e8-3fdeb41852ee-kube-api-access-wkjff\") 
pod \"auto-csr-approver-29566758-wgd6c\" (UID: \"5b2e67f3-2381-404e-95e8-3fdeb41852ee\") " pod="openshift-infra/auto-csr-approver-29566758-wgd6c" Mar 20 11:18:00 crc kubenswrapper[4695]: I0320 11:18:00.476961 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566758-wgd6c" Mar 20 11:18:00 crc kubenswrapper[4695]: I0320 11:18:00.958274 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566758-wgd6c"] Mar 20 11:18:01 crc kubenswrapper[4695]: I0320 11:18:01.181330 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566758-wgd6c" event={"ID":"5b2e67f3-2381-404e-95e8-3fdeb41852ee","Type":"ContainerStarted","Data":"0139e07cff81c7128f27138586179c4adec84810d09aecdc266042f1d5422315"} Mar 20 11:18:02 crc kubenswrapper[4695]: I0320 11:18:02.190237 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566758-wgd6c" event={"ID":"5b2e67f3-2381-404e-95e8-3fdeb41852ee","Type":"ContainerStarted","Data":"bb86d9183473df9c2977058fcef2b472677a69bf62496a75f5dce79f19b60bec"} Mar 20 11:18:02 crc kubenswrapper[4695]: I0320 11:18:02.208957 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566758-wgd6c" podStartSLOduration=1.291451496 podStartE2EDuration="2.208910244s" podCreationTimestamp="2026-03-20 11:18:00 +0000 UTC" firstStartedPulling="2026-03-20 11:18:00.960707479 +0000 UTC m=+1458.741313062" lastFinishedPulling="2026-03-20 11:18:01.878166247 +0000 UTC m=+1459.658771810" observedRunningTime="2026-03-20 11:18:02.207299753 +0000 UTC m=+1459.987905316" watchObservedRunningTime="2026-03-20 11:18:02.208910244 +0000 UTC m=+1459.989515827" Mar 20 11:18:03 crc kubenswrapper[4695]: I0320 11:18:03.200792 4695 generic.go:334] "Generic (PLEG): container finished" podID="5b2e67f3-2381-404e-95e8-3fdeb41852ee" 
containerID="bb86d9183473df9c2977058fcef2b472677a69bf62496a75f5dce79f19b60bec" exitCode=0 Mar 20 11:18:03 crc kubenswrapper[4695]: I0320 11:18:03.200843 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566758-wgd6c" event={"ID":"5b2e67f3-2381-404e-95e8-3fdeb41852ee","Type":"ContainerDied","Data":"bb86d9183473df9c2977058fcef2b472677a69bf62496a75f5dce79f19b60bec"} Mar 20 11:18:04 crc kubenswrapper[4695]: I0320 11:18:04.494961 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566758-wgd6c" Mar 20 11:18:04 crc kubenswrapper[4695]: I0320 11:18:04.658411 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkjff\" (UniqueName: \"kubernetes.io/projected/5b2e67f3-2381-404e-95e8-3fdeb41852ee-kube-api-access-wkjff\") pod \"5b2e67f3-2381-404e-95e8-3fdeb41852ee\" (UID: \"5b2e67f3-2381-404e-95e8-3fdeb41852ee\") " Mar 20 11:18:04 crc kubenswrapper[4695]: I0320 11:18:04.667203 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b2e67f3-2381-404e-95e8-3fdeb41852ee-kube-api-access-wkjff" (OuterVolumeSpecName: "kube-api-access-wkjff") pod "5b2e67f3-2381-404e-95e8-3fdeb41852ee" (UID: "5b2e67f3-2381-404e-95e8-3fdeb41852ee"). InnerVolumeSpecName "kube-api-access-wkjff". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:18:04 crc kubenswrapper[4695]: I0320 11:18:04.760828 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkjff\" (UniqueName: \"kubernetes.io/projected/5b2e67f3-2381-404e-95e8-3fdeb41852ee-kube-api-access-wkjff\") on node \"crc\" DevicePath \"\"" Mar 20 11:18:05 crc kubenswrapper[4695]: I0320 11:18:05.217367 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566758-wgd6c" event={"ID":"5b2e67f3-2381-404e-95e8-3fdeb41852ee","Type":"ContainerDied","Data":"0139e07cff81c7128f27138586179c4adec84810d09aecdc266042f1d5422315"} Mar 20 11:18:05 crc kubenswrapper[4695]: I0320 11:18:05.217789 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0139e07cff81c7128f27138586179c4adec84810d09aecdc266042f1d5422315" Mar 20 11:18:05 crc kubenswrapper[4695]: I0320 11:18:05.217426 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566758-wgd6c" Mar 20 11:18:05 crc kubenswrapper[4695]: I0320 11:18:05.295819 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566752-ws9vl"] Mar 20 11:18:05 crc kubenswrapper[4695]: I0320 11:18:05.300971 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566752-ws9vl"] Mar 20 11:18:06 crc kubenswrapper[4695]: I0320 11:18:06.897400 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df3aebe2-0698-4648-ac7c-eae261c6f8c1" path="/var/lib/kubelet/pods/df3aebe2-0698-4648-ac7c-eae261c6f8c1/volumes" Mar 20 11:18:45 crc kubenswrapper[4695]: I0320 11:18:45.578397 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pxfqk"] Mar 20 11:18:45 crc kubenswrapper[4695]: E0320 11:18:45.579611 4695 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5b2e67f3-2381-404e-95e8-3fdeb41852ee" containerName="oc" Mar 20 11:18:45 crc kubenswrapper[4695]: I0320 11:18:45.579629 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b2e67f3-2381-404e-95e8-3fdeb41852ee" containerName="oc" Mar 20 11:18:45 crc kubenswrapper[4695]: I0320 11:18:45.579827 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b2e67f3-2381-404e-95e8-3fdeb41852ee" containerName="oc" Mar 20 11:18:45 crc kubenswrapper[4695]: I0320 11:18:45.581172 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pxfqk" Mar 20 11:18:45 crc kubenswrapper[4695]: I0320 11:18:45.607961 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pxfqk"] Mar 20 11:18:45 crc kubenswrapper[4695]: I0320 11:18:45.777906 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cf26d9e-0a8f-4625-a88a-d70acc10ef04-utilities\") pod \"redhat-operators-pxfqk\" (UID: \"7cf26d9e-0a8f-4625-a88a-d70acc10ef04\") " pod="openshift-marketplace/redhat-operators-pxfqk" Mar 20 11:18:45 crc kubenswrapper[4695]: I0320 11:18:45.777979 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cf26d9e-0a8f-4625-a88a-d70acc10ef04-catalog-content\") pod \"redhat-operators-pxfqk\" (UID: \"7cf26d9e-0a8f-4625-a88a-d70acc10ef04\") " pod="openshift-marketplace/redhat-operators-pxfqk" Mar 20 11:18:45 crc kubenswrapper[4695]: I0320 11:18:45.778055 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9xrb\" (UniqueName: \"kubernetes.io/projected/7cf26d9e-0a8f-4625-a88a-d70acc10ef04-kube-api-access-w9xrb\") pod \"redhat-operators-pxfqk\" (UID: \"7cf26d9e-0a8f-4625-a88a-d70acc10ef04\") " 
pod="openshift-marketplace/redhat-operators-pxfqk" Mar 20 11:18:45 crc kubenswrapper[4695]: I0320 11:18:45.879537 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cf26d9e-0a8f-4625-a88a-d70acc10ef04-utilities\") pod \"redhat-operators-pxfqk\" (UID: \"7cf26d9e-0a8f-4625-a88a-d70acc10ef04\") " pod="openshift-marketplace/redhat-operators-pxfqk" Mar 20 11:18:45 crc kubenswrapper[4695]: I0320 11:18:45.879614 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cf26d9e-0a8f-4625-a88a-d70acc10ef04-catalog-content\") pod \"redhat-operators-pxfqk\" (UID: \"7cf26d9e-0a8f-4625-a88a-d70acc10ef04\") " pod="openshift-marketplace/redhat-operators-pxfqk" Mar 20 11:18:45 crc kubenswrapper[4695]: I0320 11:18:45.879655 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9xrb\" (UniqueName: \"kubernetes.io/projected/7cf26d9e-0a8f-4625-a88a-d70acc10ef04-kube-api-access-w9xrb\") pod \"redhat-operators-pxfqk\" (UID: \"7cf26d9e-0a8f-4625-a88a-d70acc10ef04\") " pod="openshift-marketplace/redhat-operators-pxfqk" Mar 20 11:18:45 crc kubenswrapper[4695]: I0320 11:18:45.880467 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cf26d9e-0a8f-4625-a88a-d70acc10ef04-utilities\") pod \"redhat-operators-pxfqk\" (UID: \"7cf26d9e-0a8f-4625-a88a-d70acc10ef04\") " pod="openshift-marketplace/redhat-operators-pxfqk" Mar 20 11:18:45 crc kubenswrapper[4695]: I0320 11:18:45.880649 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cf26d9e-0a8f-4625-a88a-d70acc10ef04-catalog-content\") pod \"redhat-operators-pxfqk\" (UID: \"7cf26d9e-0a8f-4625-a88a-d70acc10ef04\") " pod="openshift-marketplace/redhat-operators-pxfqk" Mar 20 11:18:45 crc 
kubenswrapper[4695]: I0320 11:18:45.902049 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9xrb\" (UniqueName: \"kubernetes.io/projected/7cf26d9e-0a8f-4625-a88a-d70acc10ef04-kube-api-access-w9xrb\") pod \"redhat-operators-pxfqk\" (UID: \"7cf26d9e-0a8f-4625-a88a-d70acc10ef04\") " pod="openshift-marketplace/redhat-operators-pxfqk" Mar 20 11:18:46 crc kubenswrapper[4695]: I0320 11:18:46.197398 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pxfqk" Mar 20 11:18:46 crc kubenswrapper[4695]: I0320 11:18:46.685892 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pxfqk"] Mar 20 11:18:46 crc kubenswrapper[4695]: I0320 11:18:46.784933 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pxfqk" event={"ID":"7cf26d9e-0a8f-4625-a88a-d70acc10ef04","Type":"ContainerStarted","Data":"7da10cabc9f7adcddb6bb9b78eeb8e4f212010ede5d4b9474368856687af9dff"} Mar 20 11:18:47 crc kubenswrapper[4695]: I0320 11:18:47.803363 4695 generic.go:334] "Generic (PLEG): container finished" podID="7cf26d9e-0a8f-4625-a88a-d70acc10ef04" containerID="d2fe7fbe8b482888bb826a26c567c46765c37bfe0be47da2213820821d7a17e4" exitCode=0 Mar 20 11:18:47 crc kubenswrapper[4695]: I0320 11:18:47.803483 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pxfqk" event={"ID":"7cf26d9e-0a8f-4625-a88a-d70acc10ef04","Type":"ContainerDied","Data":"d2fe7fbe8b482888bb826a26c567c46765c37bfe0be47da2213820821d7a17e4"} Mar 20 11:18:48 crc kubenswrapper[4695]: I0320 11:18:48.821882 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pxfqk" event={"ID":"7cf26d9e-0a8f-4625-a88a-d70acc10ef04","Type":"ContainerStarted","Data":"1fc4e11a57641e7c8c62d38522630277da051bca01ce512c2d15a60362c3cf34"} Mar 20 11:18:50 crc kubenswrapper[4695]: I0320 
11:18:50.839578 4695 generic.go:334] "Generic (PLEG): container finished" podID="7cf26d9e-0a8f-4625-a88a-d70acc10ef04" containerID="1fc4e11a57641e7c8c62d38522630277da051bca01ce512c2d15a60362c3cf34" exitCode=0 Mar 20 11:18:50 crc kubenswrapper[4695]: I0320 11:18:50.839656 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pxfqk" event={"ID":"7cf26d9e-0a8f-4625-a88a-d70acc10ef04","Type":"ContainerDied","Data":"1fc4e11a57641e7c8c62d38522630277da051bca01ce512c2d15a60362c3cf34"} Mar 20 11:18:51 crc kubenswrapper[4695]: I0320 11:18:51.849411 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pxfqk" event={"ID":"7cf26d9e-0a8f-4625-a88a-d70acc10ef04","Type":"ContainerStarted","Data":"70a6803966003f79ae7f19e74adf8ac3a2aeaa4bd309ccd5a44884661b399b91"} Mar 20 11:18:51 crc kubenswrapper[4695]: I0320 11:18:51.868225 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pxfqk" podStartSLOduration=3.355793854 podStartE2EDuration="6.868202357s" podCreationTimestamp="2026-03-20 11:18:45 +0000 UTC" firstStartedPulling="2026-03-20 11:18:47.806142736 +0000 UTC m=+1505.586748299" lastFinishedPulling="2026-03-20 11:18:51.318551239 +0000 UTC m=+1509.099156802" observedRunningTime="2026-03-20 11:18:51.86713302 +0000 UTC m=+1509.647738583" watchObservedRunningTime="2026-03-20 11:18:51.868202357 +0000 UTC m=+1509.648807920" Mar 20 11:18:56 crc kubenswrapper[4695]: I0320 11:18:56.198326 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pxfqk" Mar 20 11:18:56 crc kubenswrapper[4695]: I0320 11:18:56.198871 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pxfqk" Mar 20 11:18:57 crc kubenswrapper[4695]: I0320 11:18:57.246077 4695 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-pxfqk" podUID="7cf26d9e-0a8f-4625-a88a-d70acc10ef04" containerName="registry-server" probeResult="failure" output=< Mar 20 11:18:57 crc kubenswrapper[4695]: timeout: failed to connect service ":50051" within 1s Mar 20 11:18:57 crc kubenswrapper[4695]: > Mar 20 11:19:05 crc kubenswrapper[4695]: I0320 11:19:05.867930 4695 scope.go:117] "RemoveContainer" containerID="6d18f6f009b8b5cc1f2e51458fdd68cd5989eebbe5be8f69feb25f4cbb4809dc" Mar 20 11:19:06 crc kubenswrapper[4695]: I0320 11:19:06.243765 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pxfqk" Mar 20 11:19:06 crc kubenswrapper[4695]: I0320 11:19:06.290633 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pxfqk" Mar 20 11:19:06 crc kubenswrapper[4695]: I0320 11:19:06.480573 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pxfqk"] Mar 20 11:19:07 crc kubenswrapper[4695]: I0320 11:19:07.311669 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pxfqk" podUID="7cf26d9e-0a8f-4625-a88a-d70acc10ef04" containerName="registry-server" containerID="cri-o://70a6803966003f79ae7f19e74adf8ac3a2aeaa4bd309ccd5a44884661b399b91" gracePeriod=2 Mar 20 11:19:08 crc kubenswrapper[4695]: I0320 11:19:08.294975 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pxfqk" Mar 20 11:19:08 crc kubenswrapper[4695]: I0320 11:19:08.326803 4695 generic.go:334] "Generic (PLEG): container finished" podID="7cf26d9e-0a8f-4625-a88a-d70acc10ef04" containerID="70a6803966003f79ae7f19e74adf8ac3a2aeaa4bd309ccd5a44884661b399b91" exitCode=0 Mar 20 11:19:08 crc kubenswrapper[4695]: I0320 11:19:08.326878 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pxfqk" Mar 20 11:19:08 crc kubenswrapper[4695]: I0320 11:19:08.326867 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pxfqk" event={"ID":"7cf26d9e-0a8f-4625-a88a-d70acc10ef04","Type":"ContainerDied","Data":"70a6803966003f79ae7f19e74adf8ac3a2aeaa4bd309ccd5a44884661b399b91"} Mar 20 11:19:08 crc kubenswrapper[4695]: I0320 11:19:08.327841 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pxfqk" event={"ID":"7cf26d9e-0a8f-4625-a88a-d70acc10ef04","Type":"ContainerDied","Data":"7da10cabc9f7adcddb6bb9b78eeb8e4f212010ede5d4b9474368856687af9dff"} Mar 20 11:19:08 crc kubenswrapper[4695]: I0320 11:19:08.327864 4695 scope.go:117] "RemoveContainer" containerID="70a6803966003f79ae7f19e74adf8ac3a2aeaa4bd309ccd5a44884661b399b91" Mar 20 11:19:08 crc kubenswrapper[4695]: I0320 11:19:08.345192 4695 scope.go:117] "RemoveContainer" containerID="1fc4e11a57641e7c8c62d38522630277da051bca01ce512c2d15a60362c3cf34" Mar 20 11:19:08 crc kubenswrapper[4695]: I0320 11:19:08.367662 4695 scope.go:117] "RemoveContainer" containerID="d2fe7fbe8b482888bb826a26c567c46765c37bfe0be47da2213820821d7a17e4" Mar 20 11:19:08 crc kubenswrapper[4695]: I0320 11:19:08.393652 4695 scope.go:117] "RemoveContainer" containerID="70a6803966003f79ae7f19e74adf8ac3a2aeaa4bd309ccd5a44884661b399b91" Mar 20 11:19:08 crc kubenswrapper[4695]: E0320 11:19:08.394321 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70a6803966003f79ae7f19e74adf8ac3a2aeaa4bd309ccd5a44884661b399b91\": container with ID starting with 70a6803966003f79ae7f19e74adf8ac3a2aeaa4bd309ccd5a44884661b399b91 not found: ID does not exist" containerID="70a6803966003f79ae7f19e74adf8ac3a2aeaa4bd309ccd5a44884661b399b91" Mar 20 11:19:08 crc kubenswrapper[4695]: I0320 11:19:08.394428 4695 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70a6803966003f79ae7f19e74adf8ac3a2aeaa4bd309ccd5a44884661b399b91"} err="failed to get container status \"70a6803966003f79ae7f19e74adf8ac3a2aeaa4bd309ccd5a44884661b399b91\": rpc error: code = NotFound desc = could not find container \"70a6803966003f79ae7f19e74adf8ac3a2aeaa4bd309ccd5a44884661b399b91\": container with ID starting with 70a6803966003f79ae7f19e74adf8ac3a2aeaa4bd309ccd5a44884661b399b91 not found: ID does not exist" Mar 20 11:19:08 crc kubenswrapper[4695]: I0320 11:19:08.394462 4695 scope.go:117] "RemoveContainer" containerID="1fc4e11a57641e7c8c62d38522630277da051bca01ce512c2d15a60362c3cf34" Mar 20 11:19:08 crc kubenswrapper[4695]: E0320 11:19:08.394821 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fc4e11a57641e7c8c62d38522630277da051bca01ce512c2d15a60362c3cf34\": container with ID starting with 1fc4e11a57641e7c8c62d38522630277da051bca01ce512c2d15a60362c3cf34 not found: ID does not exist" containerID="1fc4e11a57641e7c8c62d38522630277da051bca01ce512c2d15a60362c3cf34" Mar 20 11:19:08 crc kubenswrapper[4695]: I0320 11:19:08.394847 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fc4e11a57641e7c8c62d38522630277da051bca01ce512c2d15a60362c3cf34"} err="failed to get container status \"1fc4e11a57641e7c8c62d38522630277da051bca01ce512c2d15a60362c3cf34\": rpc error: code = NotFound desc = could not find container \"1fc4e11a57641e7c8c62d38522630277da051bca01ce512c2d15a60362c3cf34\": container with ID starting with 1fc4e11a57641e7c8c62d38522630277da051bca01ce512c2d15a60362c3cf34 not found: ID does not exist" Mar 20 11:19:08 crc kubenswrapper[4695]: I0320 11:19:08.394901 4695 scope.go:117] "RemoveContainer" containerID="d2fe7fbe8b482888bb826a26c567c46765c37bfe0be47da2213820821d7a17e4" Mar 20 11:19:08 crc kubenswrapper[4695]: E0320 11:19:08.395258 4695 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2fe7fbe8b482888bb826a26c567c46765c37bfe0be47da2213820821d7a17e4\": container with ID starting with d2fe7fbe8b482888bb826a26c567c46765c37bfe0be47da2213820821d7a17e4 not found: ID does not exist" containerID="d2fe7fbe8b482888bb826a26c567c46765c37bfe0be47da2213820821d7a17e4" Mar 20 11:19:08 crc kubenswrapper[4695]: I0320 11:19:08.396097 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2fe7fbe8b482888bb826a26c567c46765c37bfe0be47da2213820821d7a17e4"} err="failed to get container status \"d2fe7fbe8b482888bb826a26c567c46765c37bfe0be47da2213820821d7a17e4\": rpc error: code = NotFound desc = could not find container \"d2fe7fbe8b482888bb826a26c567c46765c37bfe0be47da2213820821d7a17e4\": container with ID starting with d2fe7fbe8b482888bb826a26c567c46765c37bfe0be47da2213820821d7a17e4 not found: ID does not exist" Mar 20 11:19:08 crc kubenswrapper[4695]: I0320 11:19:08.479366 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cf26d9e-0a8f-4625-a88a-d70acc10ef04-catalog-content\") pod \"7cf26d9e-0a8f-4625-a88a-d70acc10ef04\" (UID: \"7cf26d9e-0a8f-4625-a88a-d70acc10ef04\") " Mar 20 11:19:08 crc kubenswrapper[4695]: I0320 11:19:08.479547 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9xrb\" (UniqueName: \"kubernetes.io/projected/7cf26d9e-0a8f-4625-a88a-d70acc10ef04-kube-api-access-w9xrb\") pod \"7cf26d9e-0a8f-4625-a88a-d70acc10ef04\" (UID: \"7cf26d9e-0a8f-4625-a88a-d70acc10ef04\") " Mar 20 11:19:08 crc kubenswrapper[4695]: I0320 11:19:08.479641 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cf26d9e-0a8f-4625-a88a-d70acc10ef04-utilities\") pod \"7cf26d9e-0a8f-4625-a88a-d70acc10ef04\" 
(UID: \"7cf26d9e-0a8f-4625-a88a-d70acc10ef04\") " Mar 20 11:19:08 crc kubenswrapper[4695]: I0320 11:19:08.480998 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cf26d9e-0a8f-4625-a88a-d70acc10ef04-utilities" (OuterVolumeSpecName: "utilities") pod "7cf26d9e-0a8f-4625-a88a-d70acc10ef04" (UID: "7cf26d9e-0a8f-4625-a88a-d70acc10ef04"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:19:08 crc kubenswrapper[4695]: I0320 11:19:08.487151 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cf26d9e-0a8f-4625-a88a-d70acc10ef04-kube-api-access-w9xrb" (OuterVolumeSpecName: "kube-api-access-w9xrb") pod "7cf26d9e-0a8f-4625-a88a-d70acc10ef04" (UID: "7cf26d9e-0a8f-4625-a88a-d70acc10ef04"). InnerVolumeSpecName "kube-api-access-w9xrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:19:08 crc kubenswrapper[4695]: I0320 11:19:08.581039 4695 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cf26d9e-0a8f-4625-a88a-d70acc10ef04-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:19:08 crc kubenswrapper[4695]: I0320 11:19:08.581094 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9xrb\" (UniqueName: \"kubernetes.io/projected/7cf26d9e-0a8f-4625-a88a-d70acc10ef04-kube-api-access-w9xrb\") on node \"crc\" DevicePath \"\"" Mar 20 11:19:08 crc kubenswrapper[4695]: I0320 11:19:08.604556 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cf26d9e-0a8f-4625-a88a-d70acc10ef04-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7cf26d9e-0a8f-4625-a88a-d70acc10ef04" (UID: "7cf26d9e-0a8f-4625-a88a-d70acc10ef04"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:19:08 crc kubenswrapper[4695]: I0320 11:19:08.664587 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pxfqk"] Mar 20 11:19:08 crc kubenswrapper[4695]: I0320 11:19:08.671375 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pxfqk"] Mar 20 11:19:08 crc kubenswrapper[4695]: I0320 11:19:08.682775 4695 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cf26d9e-0a8f-4625-a88a-d70acc10ef04-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:19:08 crc kubenswrapper[4695]: I0320 11:19:08.898816 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cf26d9e-0a8f-4625-a88a-d70acc10ef04" path="/var/lib/kubelet/pods/7cf26d9e-0a8f-4625-a88a-d70acc10ef04/volumes" Mar 20 11:19:38 crc kubenswrapper[4695]: I0320 11:19:38.431132 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:19:38 crc kubenswrapper[4695]: I0320 11:19:38.431892 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:20:00 crc kubenswrapper[4695]: I0320 11:20:00.143085 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566760-dp5wf"] Mar 20 11:20:00 crc kubenswrapper[4695]: E0320 11:20:00.144127 4695 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7cf26d9e-0a8f-4625-a88a-d70acc10ef04" containerName="extract-utilities" Mar 20 11:20:00 crc kubenswrapper[4695]: I0320 11:20:00.144146 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf26d9e-0a8f-4625-a88a-d70acc10ef04" containerName="extract-utilities" Mar 20 11:20:00 crc kubenswrapper[4695]: E0320 11:20:00.144172 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf26d9e-0a8f-4625-a88a-d70acc10ef04" containerName="registry-server" Mar 20 11:20:00 crc kubenswrapper[4695]: I0320 11:20:00.144180 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf26d9e-0a8f-4625-a88a-d70acc10ef04" containerName="registry-server" Mar 20 11:20:00 crc kubenswrapper[4695]: E0320 11:20:00.144198 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cf26d9e-0a8f-4625-a88a-d70acc10ef04" containerName="extract-content" Mar 20 11:20:00 crc kubenswrapper[4695]: I0320 11:20:00.144205 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cf26d9e-0a8f-4625-a88a-d70acc10ef04" containerName="extract-content" Mar 20 11:20:00 crc kubenswrapper[4695]: I0320 11:20:00.144367 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cf26d9e-0a8f-4625-a88a-d70acc10ef04" containerName="registry-server" Mar 20 11:20:00 crc kubenswrapper[4695]: I0320 11:20:00.144929 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566760-dp5wf" Mar 20 11:20:00 crc kubenswrapper[4695]: I0320 11:20:00.147671 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5kqds" Mar 20 11:20:00 crc kubenswrapper[4695]: I0320 11:20:00.147672 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:20:00 crc kubenswrapper[4695]: I0320 11:20:00.152016 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:20:00 crc kubenswrapper[4695]: I0320 11:20:00.152832 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566760-dp5wf"] Mar 20 11:20:00 crc kubenswrapper[4695]: I0320 11:20:00.226955 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrrfw\" (UniqueName: \"kubernetes.io/projected/91d7b7ee-185c-4a03-b3c0-0916805643f6-kube-api-access-vrrfw\") pod \"auto-csr-approver-29566760-dp5wf\" (UID: \"91d7b7ee-185c-4a03-b3c0-0916805643f6\") " pod="openshift-infra/auto-csr-approver-29566760-dp5wf" Mar 20 11:20:00 crc kubenswrapper[4695]: I0320 11:20:00.327882 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrrfw\" (UniqueName: \"kubernetes.io/projected/91d7b7ee-185c-4a03-b3c0-0916805643f6-kube-api-access-vrrfw\") pod \"auto-csr-approver-29566760-dp5wf\" (UID: \"91d7b7ee-185c-4a03-b3c0-0916805643f6\") " pod="openshift-infra/auto-csr-approver-29566760-dp5wf" Mar 20 11:20:00 crc kubenswrapper[4695]: I0320 11:20:00.348128 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrrfw\" (UniqueName: \"kubernetes.io/projected/91d7b7ee-185c-4a03-b3c0-0916805643f6-kube-api-access-vrrfw\") pod \"auto-csr-approver-29566760-dp5wf\" (UID: \"91d7b7ee-185c-4a03-b3c0-0916805643f6\") " 
pod="openshift-infra/auto-csr-approver-29566760-dp5wf" Mar 20 11:20:00 crc kubenswrapper[4695]: I0320 11:20:00.463469 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566760-dp5wf" Mar 20 11:20:00 crc kubenswrapper[4695]: I0320 11:20:00.885745 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566760-dp5wf"] Mar 20 11:20:00 crc kubenswrapper[4695]: I0320 11:20:00.899355 4695 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:20:01 crc kubenswrapper[4695]: I0320 11:20:01.905273 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566760-dp5wf" event={"ID":"91d7b7ee-185c-4a03-b3c0-0916805643f6","Type":"ContainerStarted","Data":"844f582bc84cff6ca95e990ca96048fcfc4fcafb1c4289c0932fdb99f634ef07"} Mar 20 11:20:04 crc kubenswrapper[4695]: I0320 11:20:04.929663 4695 generic.go:334] "Generic (PLEG): container finished" podID="91d7b7ee-185c-4a03-b3c0-0916805643f6" containerID="18e2246563997cf8f83b0d4a56e07bdc3425dde2f1ecd1af0a6665bc36d58a0b" exitCode=0 Mar 20 11:20:04 crc kubenswrapper[4695]: I0320 11:20:04.930002 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566760-dp5wf" event={"ID":"91d7b7ee-185c-4a03-b3c0-0916805643f6","Type":"ContainerDied","Data":"18e2246563997cf8f83b0d4a56e07bdc3425dde2f1ecd1af0a6665bc36d58a0b"} Mar 20 11:20:06 crc kubenswrapper[4695]: I0320 11:20:06.351441 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566760-dp5wf" Mar 20 11:20:06 crc kubenswrapper[4695]: I0320 11:20:06.444660 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrrfw\" (UniqueName: \"kubernetes.io/projected/91d7b7ee-185c-4a03-b3c0-0916805643f6-kube-api-access-vrrfw\") pod \"91d7b7ee-185c-4a03-b3c0-0916805643f6\" (UID: \"91d7b7ee-185c-4a03-b3c0-0916805643f6\") " Mar 20 11:20:06 crc kubenswrapper[4695]: I0320 11:20:06.450125 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91d7b7ee-185c-4a03-b3c0-0916805643f6-kube-api-access-vrrfw" (OuterVolumeSpecName: "kube-api-access-vrrfw") pod "91d7b7ee-185c-4a03-b3c0-0916805643f6" (UID: "91d7b7ee-185c-4a03-b3c0-0916805643f6"). InnerVolumeSpecName "kube-api-access-vrrfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:20:06 crc kubenswrapper[4695]: I0320 11:20:06.545779 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrrfw\" (UniqueName: \"kubernetes.io/projected/91d7b7ee-185c-4a03-b3c0-0916805643f6-kube-api-access-vrrfw\") on node \"crc\" DevicePath \"\"" Mar 20 11:20:06 crc kubenswrapper[4695]: I0320 11:20:06.947301 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566760-dp5wf" event={"ID":"91d7b7ee-185c-4a03-b3c0-0916805643f6","Type":"ContainerDied","Data":"844f582bc84cff6ca95e990ca96048fcfc4fcafb1c4289c0932fdb99f634ef07"} Mar 20 11:20:06 crc kubenswrapper[4695]: I0320 11:20:06.947347 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="844f582bc84cff6ca95e990ca96048fcfc4fcafb1c4289c0932fdb99f634ef07" Mar 20 11:20:06 crc kubenswrapper[4695]: I0320 11:20:06.947383 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566760-dp5wf" Mar 20 11:20:07 crc kubenswrapper[4695]: I0320 11:20:07.429535 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566754-q2pt7"] Mar 20 11:20:07 crc kubenswrapper[4695]: I0320 11:20:07.437216 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566754-q2pt7"] Mar 20 11:20:08 crc kubenswrapper[4695]: I0320 11:20:08.430683 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:20:08 crc kubenswrapper[4695]: I0320 11:20:08.431136 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:20:08 crc kubenswrapper[4695]: I0320 11:20:08.896193 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="decc7126-f843-4c09-aead-83b7584f1a0f" path="/var/lib/kubelet/pods/decc7126-f843-4c09-aead-83b7584f1a0f/volumes" Mar 20 11:20:26 crc kubenswrapper[4695]: I0320 11:20:26.913167 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5kng9"] Mar 20 11:20:26 crc kubenswrapper[4695]: E0320 11:20:26.914111 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91d7b7ee-185c-4a03-b3c0-0916805643f6" containerName="oc" Mar 20 11:20:26 crc kubenswrapper[4695]: I0320 11:20:26.914123 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="91d7b7ee-185c-4a03-b3c0-0916805643f6" containerName="oc" Mar 20 11:20:26 crc 
kubenswrapper[4695]: I0320 11:20:26.914273 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="91d7b7ee-185c-4a03-b3c0-0916805643f6" containerName="oc" Mar 20 11:20:26 crc kubenswrapper[4695]: I0320 11:20:26.915255 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5kng9" Mar 20 11:20:26 crc kubenswrapper[4695]: I0320 11:20:26.925472 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5kng9"] Mar 20 11:20:27 crc kubenswrapper[4695]: I0320 11:20:27.042853 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e3b7582-d5fb-4abc-91f1-68097a188855-utilities\") pod \"community-operators-5kng9\" (UID: \"7e3b7582-d5fb-4abc-91f1-68097a188855\") " pod="openshift-marketplace/community-operators-5kng9" Mar 20 11:20:27 crc kubenswrapper[4695]: I0320 11:20:27.042918 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e3b7582-d5fb-4abc-91f1-68097a188855-catalog-content\") pod \"community-operators-5kng9\" (UID: \"7e3b7582-d5fb-4abc-91f1-68097a188855\") " pod="openshift-marketplace/community-operators-5kng9" Mar 20 11:20:27 crc kubenswrapper[4695]: I0320 11:20:27.042952 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpx92\" (UniqueName: \"kubernetes.io/projected/7e3b7582-d5fb-4abc-91f1-68097a188855-kube-api-access-tpx92\") pod \"community-operators-5kng9\" (UID: \"7e3b7582-d5fb-4abc-91f1-68097a188855\") " pod="openshift-marketplace/community-operators-5kng9" Mar 20 11:20:27 crc kubenswrapper[4695]: I0320 11:20:27.144419 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/7e3b7582-d5fb-4abc-91f1-68097a188855-utilities\") pod \"community-operators-5kng9\" (UID: \"7e3b7582-d5fb-4abc-91f1-68097a188855\") " pod="openshift-marketplace/community-operators-5kng9" Mar 20 11:20:27 crc kubenswrapper[4695]: I0320 11:20:27.144483 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e3b7582-d5fb-4abc-91f1-68097a188855-catalog-content\") pod \"community-operators-5kng9\" (UID: \"7e3b7582-d5fb-4abc-91f1-68097a188855\") " pod="openshift-marketplace/community-operators-5kng9" Mar 20 11:20:27 crc kubenswrapper[4695]: I0320 11:20:27.144538 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpx92\" (UniqueName: \"kubernetes.io/projected/7e3b7582-d5fb-4abc-91f1-68097a188855-kube-api-access-tpx92\") pod \"community-operators-5kng9\" (UID: \"7e3b7582-d5fb-4abc-91f1-68097a188855\") " pod="openshift-marketplace/community-operators-5kng9" Mar 20 11:20:27 crc kubenswrapper[4695]: I0320 11:20:27.145116 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e3b7582-d5fb-4abc-91f1-68097a188855-utilities\") pod \"community-operators-5kng9\" (UID: \"7e3b7582-d5fb-4abc-91f1-68097a188855\") " pod="openshift-marketplace/community-operators-5kng9" Mar 20 11:20:27 crc kubenswrapper[4695]: I0320 11:20:27.145227 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e3b7582-d5fb-4abc-91f1-68097a188855-catalog-content\") pod \"community-operators-5kng9\" (UID: \"7e3b7582-d5fb-4abc-91f1-68097a188855\") " pod="openshift-marketplace/community-operators-5kng9" Mar 20 11:20:27 crc kubenswrapper[4695]: I0320 11:20:27.176277 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpx92\" (UniqueName: 
\"kubernetes.io/projected/7e3b7582-d5fb-4abc-91f1-68097a188855-kube-api-access-tpx92\") pod \"community-operators-5kng9\" (UID: \"7e3b7582-d5fb-4abc-91f1-68097a188855\") " pod="openshift-marketplace/community-operators-5kng9" Mar 20 11:20:27 crc kubenswrapper[4695]: I0320 11:20:27.232823 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5kng9" Mar 20 11:20:27 crc kubenswrapper[4695]: I0320 11:20:27.701226 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5kng9"] Mar 20 11:20:27 crc kubenswrapper[4695]: W0320 11:20:27.706695 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e3b7582_d5fb_4abc_91f1_68097a188855.slice/crio-0eae08147ee137096f88bce19b44f39a52f39f73929d53ff08c05c4865088c31 WatchSource:0}: Error finding container 0eae08147ee137096f88bce19b44f39a52f39f73929d53ff08c05c4865088c31: Status 404 returned error can't find the container with id 0eae08147ee137096f88bce19b44f39a52f39f73929d53ff08c05c4865088c31 Mar 20 11:20:28 crc kubenswrapper[4695]: I0320 11:20:28.102328 4695 generic.go:334] "Generic (PLEG): container finished" podID="7e3b7582-d5fb-4abc-91f1-68097a188855" containerID="021ab65e3086a55ac00d87bb9e5e18218df7a2c8cc4e33a7c8fca69664dc6bf9" exitCode=0 Mar 20 11:20:28 crc kubenswrapper[4695]: I0320 11:20:28.102766 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5kng9" event={"ID":"7e3b7582-d5fb-4abc-91f1-68097a188855","Type":"ContainerDied","Data":"021ab65e3086a55ac00d87bb9e5e18218df7a2c8cc4e33a7c8fca69664dc6bf9"} Mar 20 11:20:28 crc kubenswrapper[4695]: I0320 11:20:28.102808 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5kng9" 
event={"ID":"7e3b7582-d5fb-4abc-91f1-68097a188855","Type":"ContainerStarted","Data":"0eae08147ee137096f88bce19b44f39a52f39f73929d53ff08c05c4865088c31"} Mar 20 11:20:35 crc kubenswrapper[4695]: I0320 11:20:35.154642 4695 generic.go:334] "Generic (PLEG): container finished" podID="7e3b7582-d5fb-4abc-91f1-68097a188855" containerID="fc0e432c1c672bbba4de898545f42e2a07bb4c0a9f7e55628117058cd72e668e" exitCode=0 Mar 20 11:20:35 crc kubenswrapper[4695]: I0320 11:20:35.154702 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5kng9" event={"ID":"7e3b7582-d5fb-4abc-91f1-68097a188855","Type":"ContainerDied","Data":"fc0e432c1c672bbba4de898545f42e2a07bb4c0a9f7e55628117058cd72e668e"} Mar 20 11:20:36 crc kubenswrapper[4695]: I0320 11:20:36.165051 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5kng9" event={"ID":"7e3b7582-d5fb-4abc-91f1-68097a188855","Type":"ContainerStarted","Data":"dfd2f62453a40dacab381ae0f03e648ce33ac7309a85ccd354ba1e4b54adbbd8"} Mar 20 11:20:36 crc kubenswrapper[4695]: I0320 11:20:36.187223 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5kng9" podStartSLOduration=2.475579872 podStartE2EDuration="10.187196668s" podCreationTimestamp="2026-03-20 11:20:26 +0000 UTC" firstStartedPulling="2026-03-20 11:20:28.108195652 +0000 UTC m=+1605.888801225" lastFinishedPulling="2026-03-20 11:20:35.819812448 +0000 UTC m=+1613.600418021" observedRunningTime="2026-03-20 11:20:36.182062307 +0000 UTC m=+1613.962667890" watchObservedRunningTime="2026-03-20 11:20:36.187196668 +0000 UTC m=+1613.967802231" Mar 20 11:20:37 crc kubenswrapper[4695]: I0320 11:20:37.233390 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5kng9" Mar 20 11:20:37 crc kubenswrapper[4695]: I0320 11:20:37.233457 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-5kng9" Mar 20 11:20:38 crc kubenswrapper[4695]: I0320 11:20:38.302040 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-5kng9" podUID="7e3b7582-d5fb-4abc-91f1-68097a188855" containerName="registry-server" probeResult="failure" output=< Mar 20 11:20:38 crc kubenswrapper[4695]: timeout: failed to connect service ":50051" within 1s Mar 20 11:20:38 crc kubenswrapper[4695]: > Mar 20 11:20:38 crc kubenswrapper[4695]: I0320 11:20:38.475240 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:20:38 crc kubenswrapper[4695]: I0320 11:20:38.475296 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:20:38 crc kubenswrapper[4695]: I0320 11:20:38.475346 4695 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" Mar 20 11:20:38 crc kubenswrapper[4695]: I0320 11:20:38.476011 4695 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4f610c401878857d4cef8a9377c20e48073892cd3a403b8cebff93bc8913af7a"} pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:20:38 crc kubenswrapper[4695]: I0320 11:20:38.476067 4695 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" containerID="cri-o://4f610c401878857d4cef8a9377c20e48073892cd3a403b8cebff93bc8913af7a" gracePeriod=600 Mar 20 11:20:38 crc kubenswrapper[4695]: E0320 11:20:38.598462 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:20:39 crc kubenswrapper[4695]: I0320 11:20:39.189144 4695 generic.go:334] "Generic (PLEG): container finished" podID="7859c924-84d7-4855-901e-c77a02c56e3a" containerID="4f610c401878857d4cef8a9377c20e48073892cd3a403b8cebff93bc8913af7a" exitCode=0 Mar 20 11:20:39 crc kubenswrapper[4695]: I0320 11:20:39.189207 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" event={"ID":"7859c924-84d7-4855-901e-c77a02c56e3a","Type":"ContainerDied","Data":"4f610c401878857d4cef8a9377c20e48073892cd3a403b8cebff93bc8913af7a"} Mar 20 11:20:39 crc kubenswrapper[4695]: I0320 11:20:39.189312 4695 scope.go:117] "RemoveContainer" containerID="8bc9cc8c349e0d536cb6c63da318c52d0806648b595c149581e7bb85b97cf1a4" Mar 20 11:20:39 crc kubenswrapper[4695]: I0320 11:20:39.190071 4695 scope.go:117] "RemoveContainer" containerID="4f610c401878857d4cef8a9377c20e48073892cd3a403b8cebff93bc8913af7a" Mar 20 11:20:39 crc kubenswrapper[4695]: E0320 11:20:39.190361 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:20:47 crc kubenswrapper[4695]: I0320 11:20:47.275951 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5kng9" Mar 20 11:20:47 crc kubenswrapper[4695]: I0320 11:20:47.324781 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5kng9" Mar 20 11:20:47 crc kubenswrapper[4695]: I0320 11:20:47.401273 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5kng9"] Mar 20 11:20:47 crc kubenswrapper[4695]: I0320 11:20:47.510044 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t8f2c"] Mar 20 11:20:47 crc kubenswrapper[4695]: I0320 11:20:47.510350 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-t8f2c" podUID="1754a3b6-2691-4878-8fb1-38668a0e103a" containerName="registry-server" containerID="cri-o://e3de22dc5ffebea1e34435267b0774364383271b23fbb6e022ba41890e984fcc" gracePeriod=2 Mar 20 11:20:48 crc kubenswrapper[4695]: I0320 11:20:48.049017 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t8f2c" Mar 20 11:20:48 crc kubenswrapper[4695]: I0320 11:20:48.186238 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzhfk\" (UniqueName: \"kubernetes.io/projected/1754a3b6-2691-4878-8fb1-38668a0e103a-kube-api-access-qzhfk\") pod \"1754a3b6-2691-4878-8fb1-38668a0e103a\" (UID: \"1754a3b6-2691-4878-8fb1-38668a0e103a\") " Mar 20 11:20:48 crc kubenswrapper[4695]: I0320 11:20:48.186340 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1754a3b6-2691-4878-8fb1-38668a0e103a-catalog-content\") pod \"1754a3b6-2691-4878-8fb1-38668a0e103a\" (UID: \"1754a3b6-2691-4878-8fb1-38668a0e103a\") " Mar 20 11:20:48 crc kubenswrapper[4695]: I0320 11:20:48.186412 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1754a3b6-2691-4878-8fb1-38668a0e103a-utilities\") pod \"1754a3b6-2691-4878-8fb1-38668a0e103a\" (UID: \"1754a3b6-2691-4878-8fb1-38668a0e103a\") " Mar 20 11:20:48 crc kubenswrapper[4695]: I0320 11:20:48.187323 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1754a3b6-2691-4878-8fb1-38668a0e103a-utilities" (OuterVolumeSpecName: "utilities") pod "1754a3b6-2691-4878-8fb1-38668a0e103a" (UID: "1754a3b6-2691-4878-8fb1-38668a0e103a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:20:48 crc kubenswrapper[4695]: I0320 11:20:48.192603 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1754a3b6-2691-4878-8fb1-38668a0e103a-kube-api-access-qzhfk" (OuterVolumeSpecName: "kube-api-access-qzhfk") pod "1754a3b6-2691-4878-8fb1-38668a0e103a" (UID: "1754a3b6-2691-4878-8fb1-38668a0e103a"). InnerVolumeSpecName "kube-api-access-qzhfk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:20:48 crc kubenswrapper[4695]: I0320 11:20:48.247532 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1754a3b6-2691-4878-8fb1-38668a0e103a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1754a3b6-2691-4878-8fb1-38668a0e103a" (UID: "1754a3b6-2691-4878-8fb1-38668a0e103a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:20:48 crc kubenswrapper[4695]: I0320 11:20:48.261349 4695 generic.go:334] "Generic (PLEG): container finished" podID="1754a3b6-2691-4878-8fb1-38668a0e103a" containerID="e3de22dc5ffebea1e34435267b0774364383271b23fbb6e022ba41890e984fcc" exitCode=0 Mar 20 11:20:48 crc kubenswrapper[4695]: I0320 11:20:48.261436 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t8f2c" event={"ID":"1754a3b6-2691-4878-8fb1-38668a0e103a","Type":"ContainerDied","Data":"e3de22dc5ffebea1e34435267b0774364383271b23fbb6e022ba41890e984fcc"} Mar 20 11:20:48 crc kubenswrapper[4695]: I0320 11:20:48.261507 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t8f2c" event={"ID":"1754a3b6-2691-4878-8fb1-38668a0e103a","Type":"ContainerDied","Data":"7300209331cb8844b86e6d0e64b696685454f3e578477a4223533ac8aec59c14"} Mar 20 11:20:48 crc kubenswrapper[4695]: I0320 11:20:48.261534 4695 scope.go:117] "RemoveContainer" containerID="e3de22dc5ffebea1e34435267b0774364383271b23fbb6e022ba41890e984fcc" Mar 20 11:20:48 crc kubenswrapper[4695]: I0320 11:20:48.261778 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t8f2c" Mar 20 11:20:48 crc kubenswrapper[4695]: I0320 11:20:48.284281 4695 scope.go:117] "RemoveContainer" containerID="a8ca521b1d9757ddde3e9799f585f58b7526afec4830d0916765da2ac5b21ca7" Mar 20 11:20:48 crc kubenswrapper[4695]: I0320 11:20:48.287858 4695 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1754a3b6-2691-4878-8fb1-38668a0e103a-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:20:48 crc kubenswrapper[4695]: I0320 11:20:48.287971 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzhfk\" (UniqueName: \"kubernetes.io/projected/1754a3b6-2691-4878-8fb1-38668a0e103a-kube-api-access-qzhfk\") on node \"crc\" DevicePath \"\"" Mar 20 11:20:48 crc kubenswrapper[4695]: I0320 11:20:48.288040 4695 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1754a3b6-2691-4878-8fb1-38668a0e103a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:20:48 crc kubenswrapper[4695]: I0320 11:20:48.294257 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t8f2c"] Mar 20 11:20:48 crc kubenswrapper[4695]: I0320 11:20:48.303157 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-t8f2c"] Mar 20 11:20:48 crc kubenswrapper[4695]: I0320 11:20:48.440698 4695 scope.go:117] "RemoveContainer" containerID="9d134c4f3046cc6e78911c9ef6c177e119bb563a245b897a79ba206ca733de5d" Mar 20 11:20:48 crc kubenswrapper[4695]: I0320 11:20:48.480155 4695 scope.go:117] "RemoveContainer" containerID="e3de22dc5ffebea1e34435267b0774364383271b23fbb6e022ba41890e984fcc" Mar 20 11:20:48 crc kubenswrapper[4695]: E0320 11:20:48.489314 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e3de22dc5ffebea1e34435267b0774364383271b23fbb6e022ba41890e984fcc\": container with ID starting with e3de22dc5ffebea1e34435267b0774364383271b23fbb6e022ba41890e984fcc not found: ID does not exist" containerID="e3de22dc5ffebea1e34435267b0774364383271b23fbb6e022ba41890e984fcc" Mar 20 11:20:48 crc kubenswrapper[4695]: I0320 11:20:48.489377 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3de22dc5ffebea1e34435267b0774364383271b23fbb6e022ba41890e984fcc"} err="failed to get container status \"e3de22dc5ffebea1e34435267b0774364383271b23fbb6e022ba41890e984fcc\": rpc error: code = NotFound desc = could not find container \"e3de22dc5ffebea1e34435267b0774364383271b23fbb6e022ba41890e984fcc\": container with ID starting with e3de22dc5ffebea1e34435267b0774364383271b23fbb6e022ba41890e984fcc not found: ID does not exist" Mar 20 11:20:48 crc kubenswrapper[4695]: I0320 11:20:48.489410 4695 scope.go:117] "RemoveContainer" containerID="a8ca521b1d9757ddde3e9799f585f58b7526afec4830d0916765da2ac5b21ca7" Mar 20 11:20:48 crc kubenswrapper[4695]: E0320 11:20:48.491034 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8ca521b1d9757ddde3e9799f585f58b7526afec4830d0916765da2ac5b21ca7\": container with ID starting with a8ca521b1d9757ddde3e9799f585f58b7526afec4830d0916765da2ac5b21ca7 not found: ID does not exist" containerID="a8ca521b1d9757ddde3e9799f585f58b7526afec4830d0916765da2ac5b21ca7" Mar 20 11:20:48 crc kubenswrapper[4695]: I0320 11:20:48.491074 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8ca521b1d9757ddde3e9799f585f58b7526afec4830d0916765da2ac5b21ca7"} err="failed to get container status \"a8ca521b1d9757ddde3e9799f585f58b7526afec4830d0916765da2ac5b21ca7\": rpc error: code = NotFound desc = could not find container \"a8ca521b1d9757ddde3e9799f585f58b7526afec4830d0916765da2ac5b21ca7\": container with ID 
starting with a8ca521b1d9757ddde3e9799f585f58b7526afec4830d0916765da2ac5b21ca7 not found: ID does not exist" Mar 20 11:20:48 crc kubenswrapper[4695]: I0320 11:20:48.491106 4695 scope.go:117] "RemoveContainer" containerID="9d134c4f3046cc6e78911c9ef6c177e119bb563a245b897a79ba206ca733de5d" Mar 20 11:20:48 crc kubenswrapper[4695]: E0320 11:20:48.492022 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d134c4f3046cc6e78911c9ef6c177e119bb563a245b897a79ba206ca733de5d\": container with ID starting with 9d134c4f3046cc6e78911c9ef6c177e119bb563a245b897a79ba206ca733de5d not found: ID does not exist" containerID="9d134c4f3046cc6e78911c9ef6c177e119bb563a245b897a79ba206ca733de5d" Mar 20 11:20:48 crc kubenswrapper[4695]: I0320 11:20:48.492061 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d134c4f3046cc6e78911c9ef6c177e119bb563a245b897a79ba206ca733de5d"} err="failed to get container status \"9d134c4f3046cc6e78911c9ef6c177e119bb563a245b897a79ba206ca733de5d\": rpc error: code = NotFound desc = could not find container \"9d134c4f3046cc6e78911c9ef6c177e119bb563a245b897a79ba206ca733de5d\": container with ID starting with 9d134c4f3046cc6e78911c9ef6c177e119bb563a245b897a79ba206ca733de5d not found: ID does not exist" Mar 20 11:20:48 crc kubenswrapper[4695]: I0320 11:20:48.897890 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1754a3b6-2691-4878-8fb1-38668a0e103a" path="/var/lib/kubelet/pods/1754a3b6-2691-4878-8fb1-38668a0e103a/volumes" Mar 20 11:20:53 crc kubenswrapper[4695]: I0320 11:20:53.887895 4695 scope.go:117] "RemoveContainer" containerID="4f610c401878857d4cef8a9377c20e48073892cd3a403b8cebff93bc8913af7a" Mar 20 11:20:53 crc kubenswrapper[4695]: E0320 11:20:53.888607 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:21:05 crc kubenswrapper[4695]: I0320 11:21:05.981112 4695 scope.go:117] "RemoveContainer" containerID="2bf15c95d48ed2fb30c6ae5cd2bad947d1425285e80ae9ee1d5ea3818a59455b" Mar 20 11:21:07 crc kubenswrapper[4695]: I0320 11:21:07.887690 4695 scope.go:117] "RemoveContainer" containerID="4f610c401878857d4cef8a9377c20e48073892cd3a403b8cebff93bc8913af7a" Mar 20 11:21:07 crc kubenswrapper[4695]: E0320 11:21:07.888415 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:21:20 crc kubenswrapper[4695]: I0320 11:21:20.887460 4695 scope.go:117] "RemoveContainer" containerID="4f610c401878857d4cef8a9377c20e48073892cd3a403b8cebff93bc8913af7a" Mar 20 11:21:20 crc kubenswrapper[4695]: E0320 11:21:20.888388 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:21:35 crc kubenswrapper[4695]: I0320 11:21:35.887192 4695 scope.go:117] "RemoveContainer" containerID="4f610c401878857d4cef8a9377c20e48073892cd3a403b8cebff93bc8913af7a" Mar 20 11:21:35 crc 
kubenswrapper[4695]: E0320 11:21:35.888115 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:21:36 crc kubenswrapper[4695]: I0320 11:21:36.885146 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b4mpc"] Mar 20 11:21:36 crc kubenswrapper[4695]: E0320 11:21:36.885535 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1754a3b6-2691-4878-8fb1-38668a0e103a" containerName="extract-content" Mar 20 11:21:36 crc kubenswrapper[4695]: I0320 11:21:36.885552 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="1754a3b6-2691-4878-8fb1-38668a0e103a" containerName="extract-content" Mar 20 11:21:36 crc kubenswrapper[4695]: E0320 11:21:36.885575 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1754a3b6-2691-4878-8fb1-38668a0e103a" containerName="registry-server" Mar 20 11:21:36 crc kubenswrapper[4695]: I0320 11:21:36.885583 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="1754a3b6-2691-4878-8fb1-38668a0e103a" containerName="registry-server" Mar 20 11:21:36 crc kubenswrapper[4695]: E0320 11:21:36.885604 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1754a3b6-2691-4878-8fb1-38668a0e103a" containerName="extract-utilities" Mar 20 11:21:36 crc kubenswrapper[4695]: I0320 11:21:36.885611 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="1754a3b6-2691-4878-8fb1-38668a0e103a" containerName="extract-utilities" Mar 20 11:21:36 crc kubenswrapper[4695]: I0320 11:21:36.885819 4695 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1754a3b6-2691-4878-8fb1-38668a0e103a" containerName="registry-server" Mar 20 11:21:36 crc kubenswrapper[4695]: I0320 11:21:36.887303 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b4mpc" Mar 20 11:21:36 crc kubenswrapper[4695]: I0320 11:21:36.901567 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b4mpc"] Mar 20 11:21:37 crc kubenswrapper[4695]: I0320 11:21:36.954294 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djn6f\" (UniqueName: \"kubernetes.io/projected/d92bd8f9-35ea-46d9-a902-fc14a698fd4c-kube-api-access-djn6f\") pod \"redhat-marketplace-b4mpc\" (UID: \"d92bd8f9-35ea-46d9-a902-fc14a698fd4c\") " pod="openshift-marketplace/redhat-marketplace-b4mpc" Mar 20 11:21:37 crc kubenswrapper[4695]: I0320 11:21:36.954597 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d92bd8f9-35ea-46d9-a902-fc14a698fd4c-utilities\") pod \"redhat-marketplace-b4mpc\" (UID: \"d92bd8f9-35ea-46d9-a902-fc14a698fd4c\") " pod="openshift-marketplace/redhat-marketplace-b4mpc" Mar 20 11:21:37 crc kubenswrapper[4695]: I0320 11:21:36.954661 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d92bd8f9-35ea-46d9-a902-fc14a698fd4c-catalog-content\") pod \"redhat-marketplace-b4mpc\" (UID: \"d92bd8f9-35ea-46d9-a902-fc14a698fd4c\") " pod="openshift-marketplace/redhat-marketplace-b4mpc" Mar 20 11:21:37 crc kubenswrapper[4695]: I0320 11:21:37.060392 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d92bd8f9-35ea-46d9-a902-fc14a698fd4c-utilities\") pod \"redhat-marketplace-b4mpc\" (UID: 
\"d92bd8f9-35ea-46d9-a902-fc14a698fd4c\") " pod="openshift-marketplace/redhat-marketplace-b4mpc" Mar 20 11:21:37 crc kubenswrapper[4695]: I0320 11:21:37.060471 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d92bd8f9-35ea-46d9-a902-fc14a698fd4c-catalog-content\") pod \"redhat-marketplace-b4mpc\" (UID: \"d92bd8f9-35ea-46d9-a902-fc14a698fd4c\") " pod="openshift-marketplace/redhat-marketplace-b4mpc" Mar 20 11:21:37 crc kubenswrapper[4695]: I0320 11:21:37.060512 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djn6f\" (UniqueName: \"kubernetes.io/projected/d92bd8f9-35ea-46d9-a902-fc14a698fd4c-kube-api-access-djn6f\") pod \"redhat-marketplace-b4mpc\" (UID: \"d92bd8f9-35ea-46d9-a902-fc14a698fd4c\") " pod="openshift-marketplace/redhat-marketplace-b4mpc" Mar 20 11:21:37 crc kubenswrapper[4695]: I0320 11:21:37.140439 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d92bd8f9-35ea-46d9-a902-fc14a698fd4c-catalog-content\") pod \"redhat-marketplace-b4mpc\" (UID: \"d92bd8f9-35ea-46d9-a902-fc14a698fd4c\") " pod="openshift-marketplace/redhat-marketplace-b4mpc" Mar 20 11:21:37 crc kubenswrapper[4695]: I0320 11:21:37.143804 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d92bd8f9-35ea-46d9-a902-fc14a698fd4c-utilities\") pod \"redhat-marketplace-b4mpc\" (UID: \"d92bd8f9-35ea-46d9-a902-fc14a698fd4c\") " pod="openshift-marketplace/redhat-marketplace-b4mpc" Mar 20 11:21:37 crc kubenswrapper[4695]: I0320 11:21:37.184541 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djn6f\" (UniqueName: \"kubernetes.io/projected/d92bd8f9-35ea-46d9-a902-fc14a698fd4c-kube-api-access-djn6f\") pod \"redhat-marketplace-b4mpc\" (UID: 
\"d92bd8f9-35ea-46d9-a902-fc14a698fd4c\") " pod="openshift-marketplace/redhat-marketplace-b4mpc" Mar 20 11:21:37 crc kubenswrapper[4695]: I0320 11:21:37.211134 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b4mpc" Mar 20 11:21:37 crc kubenswrapper[4695]: I0320 11:21:37.722528 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b4mpc"] Mar 20 11:21:38 crc kubenswrapper[4695]: I0320 11:21:38.125191 4695 generic.go:334] "Generic (PLEG): container finished" podID="d92bd8f9-35ea-46d9-a902-fc14a698fd4c" containerID="f581aba12f1a67820577b18b3a311c2bffef7a509cb609aa7a30c9d8afd11789" exitCode=0 Mar 20 11:21:38 crc kubenswrapper[4695]: I0320 11:21:38.125239 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b4mpc" event={"ID":"d92bd8f9-35ea-46d9-a902-fc14a698fd4c","Type":"ContainerDied","Data":"f581aba12f1a67820577b18b3a311c2bffef7a509cb609aa7a30c9d8afd11789"} Mar 20 11:21:38 crc kubenswrapper[4695]: I0320 11:21:38.125292 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b4mpc" event={"ID":"d92bd8f9-35ea-46d9-a902-fc14a698fd4c","Type":"ContainerStarted","Data":"87e1c48d276b65bcc24b9cb8c7588e976029e1a4d88a9dd1abf8c4678c8ac7fa"} Mar 20 11:21:40 crc kubenswrapper[4695]: I0320 11:21:40.142464 4695 generic.go:334] "Generic (PLEG): container finished" podID="d92bd8f9-35ea-46d9-a902-fc14a698fd4c" containerID="ec8b9510d9d7dd9b8bf7c50e8e8ea36d6155654ec48071d0a98b7a80227d25cc" exitCode=0 Mar 20 11:21:40 crc kubenswrapper[4695]: I0320 11:21:40.142544 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b4mpc" event={"ID":"d92bd8f9-35ea-46d9-a902-fc14a698fd4c","Type":"ContainerDied","Data":"ec8b9510d9d7dd9b8bf7c50e8e8ea36d6155654ec48071d0a98b7a80227d25cc"} Mar 20 11:21:41 crc kubenswrapper[4695]: I0320 11:21:41.155256 
4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b4mpc" event={"ID":"d92bd8f9-35ea-46d9-a902-fc14a698fd4c","Type":"ContainerStarted","Data":"00e610d3637c108a2b3299141bbadc1a51155b2e278cec9ef1fa8b1ad94d718e"} Mar 20 11:21:41 crc kubenswrapper[4695]: I0320 11:21:41.181649 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b4mpc" podStartSLOduration=2.652314736 podStartE2EDuration="5.181627595s" podCreationTimestamp="2026-03-20 11:21:36 +0000 UTC" firstStartedPulling="2026-03-20 11:21:38.126762465 +0000 UTC m=+1675.907368028" lastFinishedPulling="2026-03-20 11:21:40.656075304 +0000 UTC m=+1678.436680887" observedRunningTime="2026-03-20 11:21:41.175400146 +0000 UTC m=+1678.956005709" watchObservedRunningTime="2026-03-20 11:21:41.181627595 +0000 UTC m=+1678.962233148" Mar 20 11:21:47 crc kubenswrapper[4695]: I0320 11:21:47.211932 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b4mpc" Mar 20 11:21:47 crc kubenswrapper[4695]: I0320 11:21:47.214177 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b4mpc" Mar 20 11:21:47 crc kubenswrapper[4695]: I0320 11:21:47.258368 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b4mpc" Mar 20 11:21:48 crc kubenswrapper[4695]: I0320 11:21:48.251105 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b4mpc" Mar 20 11:21:48 crc kubenswrapper[4695]: I0320 11:21:48.309946 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b4mpc"] Mar 20 11:21:48 crc kubenswrapper[4695]: I0320 11:21:48.887121 4695 scope.go:117] "RemoveContainer" 
containerID="4f610c401878857d4cef8a9377c20e48073892cd3a403b8cebff93bc8913af7a" Mar 20 11:21:48 crc kubenswrapper[4695]: E0320 11:21:48.887396 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:21:50 crc kubenswrapper[4695]: I0320 11:21:50.213584 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b4mpc" podUID="d92bd8f9-35ea-46d9-a902-fc14a698fd4c" containerName="registry-server" containerID="cri-o://00e610d3637c108a2b3299141bbadc1a51155b2e278cec9ef1fa8b1ad94d718e" gracePeriod=2 Mar 20 11:21:51 crc kubenswrapper[4695]: I0320 11:21:51.223495 4695 generic.go:334] "Generic (PLEG): container finished" podID="d92bd8f9-35ea-46d9-a902-fc14a698fd4c" containerID="00e610d3637c108a2b3299141bbadc1a51155b2e278cec9ef1fa8b1ad94d718e" exitCode=0 Mar 20 11:21:51 crc kubenswrapper[4695]: I0320 11:21:51.223544 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b4mpc" event={"ID":"d92bd8f9-35ea-46d9-a902-fc14a698fd4c","Type":"ContainerDied","Data":"00e610d3637c108a2b3299141bbadc1a51155b2e278cec9ef1fa8b1ad94d718e"} Mar 20 11:21:51 crc kubenswrapper[4695]: I0320 11:21:51.781255 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b4mpc" Mar 20 11:21:51 crc kubenswrapper[4695]: I0320 11:21:51.931742 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djn6f\" (UniqueName: \"kubernetes.io/projected/d92bd8f9-35ea-46d9-a902-fc14a698fd4c-kube-api-access-djn6f\") pod \"d92bd8f9-35ea-46d9-a902-fc14a698fd4c\" (UID: \"d92bd8f9-35ea-46d9-a902-fc14a698fd4c\") " Mar 20 11:21:51 crc kubenswrapper[4695]: I0320 11:21:51.931980 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d92bd8f9-35ea-46d9-a902-fc14a698fd4c-utilities\") pod \"d92bd8f9-35ea-46d9-a902-fc14a698fd4c\" (UID: \"d92bd8f9-35ea-46d9-a902-fc14a698fd4c\") " Mar 20 11:21:51 crc kubenswrapper[4695]: I0320 11:21:51.932058 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d92bd8f9-35ea-46d9-a902-fc14a698fd4c-catalog-content\") pod \"d92bd8f9-35ea-46d9-a902-fc14a698fd4c\" (UID: \"d92bd8f9-35ea-46d9-a902-fc14a698fd4c\") " Mar 20 11:21:51 crc kubenswrapper[4695]: I0320 11:21:51.933151 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d92bd8f9-35ea-46d9-a902-fc14a698fd4c-utilities" (OuterVolumeSpecName: "utilities") pod "d92bd8f9-35ea-46d9-a902-fc14a698fd4c" (UID: "d92bd8f9-35ea-46d9-a902-fc14a698fd4c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:21:51 crc kubenswrapper[4695]: I0320 11:21:51.938194 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d92bd8f9-35ea-46d9-a902-fc14a698fd4c-kube-api-access-djn6f" (OuterVolumeSpecName: "kube-api-access-djn6f") pod "d92bd8f9-35ea-46d9-a902-fc14a698fd4c" (UID: "d92bd8f9-35ea-46d9-a902-fc14a698fd4c"). InnerVolumeSpecName "kube-api-access-djn6f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:21:51 crc kubenswrapper[4695]: I0320 11:21:51.962341 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d92bd8f9-35ea-46d9-a902-fc14a698fd4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d92bd8f9-35ea-46d9-a902-fc14a698fd4c" (UID: "d92bd8f9-35ea-46d9-a902-fc14a698fd4c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:21:52 crc kubenswrapper[4695]: I0320 11:21:52.033923 4695 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d92bd8f9-35ea-46d9-a902-fc14a698fd4c-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:21:52 crc kubenswrapper[4695]: I0320 11:21:52.033961 4695 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d92bd8f9-35ea-46d9-a902-fc14a698fd4c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:21:52 crc kubenswrapper[4695]: I0320 11:21:52.033973 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djn6f\" (UniqueName: \"kubernetes.io/projected/d92bd8f9-35ea-46d9-a902-fc14a698fd4c-kube-api-access-djn6f\") on node \"crc\" DevicePath \"\"" Mar 20 11:21:52 crc kubenswrapper[4695]: I0320 11:21:52.233118 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b4mpc" event={"ID":"d92bd8f9-35ea-46d9-a902-fc14a698fd4c","Type":"ContainerDied","Data":"87e1c48d276b65bcc24b9cb8c7588e976029e1a4d88a9dd1abf8c4678c8ac7fa"} Mar 20 11:21:52 crc kubenswrapper[4695]: I0320 11:21:52.233164 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b4mpc" Mar 20 11:21:52 crc kubenswrapper[4695]: I0320 11:21:52.233186 4695 scope.go:117] "RemoveContainer" containerID="00e610d3637c108a2b3299141bbadc1a51155b2e278cec9ef1fa8b1ad94d718e" Mar 20 11:21:52 crc kubenswrapper[4695]: I0320 11:21:52.253291 4695 scope.go:117] "RemoveContainer" containerID="ec8b9510d9d7dd9b8bf7c50e8e8ea36d6155654ec48071d0a98b7a80227d25cc" Mar 20 11:21:52 crc kubenswrapper[4695]: I0320 11:21:52.268999 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b4mpc"] Mar 20 11:21:52 crc kubenswrapper[4695]: I0320 11:21:52.276118 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b4mpc"] Mar 20 11:21:52 crc kubenswrapper[4695]: I0320 11:21:52.291662 4695 scope.go:117] "RemoveContainer" containerID="f581aba12f1a67820577b18b3a311c2bffef7a509cb609aa7a30c9d8afd11789" Mar 20 11:21:52 crc kubenswrapper[4695]: I0320 11:21:52.896434 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d92bd8f9-35ea-46d9-a902-fc14a698fd4c" path="/var/lib/kubelet/pods/d92bd8f9-35ea-46d9-a902-fc14a698fd4c/volumes" Mar 20 11:22:00 crc kubenswrapper[4695]: I0320 11:22:00.152620 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566762-qsjfb"] Mar 20 11:22:00 crc kubenswrapper[4695]: E0320 11:22:00.153642 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92bd8f9-35ea-46d9-a902-fc14a698fd4c" containerName="registry-server" Mar 20 11:22:00 crc kubenswrapper[4695]: I0320 11:22:00.153655 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92bd8f9-35ea-46d9-a902-fc14a698fd4c" containerName="registry-server" Mar 20 11:22:00 crc kubenswrapper[4695]: E0320 11:22:00.153693 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92bd8f9-35ea-46d9-a902-fc14a698fd4c" containerName="extract-utilities" Mar 20 11:22:00 
crc kubenswrapper[4695]: I0320 11:22:00.153699 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92bd8f9-35ea-46d9-a902-fc14a698fd4c" containerName="extract-utilities" Mar 20 11:22:00 crc kubenswrapper[4695]: E0320 11:22:00.153713 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92bd8f9-35ea-46d9-a902-fc14a698fd4c" containerName="extract-content" Mar 20 11:22:00 crc kubenswrapper[4695]: I0320 11:22:00.153719 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92bd8f9-35ea-46d9-a902-fc14a698fd4c" containerName="extract-content" Mar 20 11:22:00 crc kubenswrapper[4695]: I0320 11:22:00.153845 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="d92bd8f9-35ea-46d9-a902-fc14a698fd4c" containerName="registry-server" Mar 20 11:22:00 crc kubenswrapper[4695]: I0320 11:22:00.154400 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566762-qsjfb" Mar 20 11:22:00 crc kubenswrapper[4695]: I0320 11:22:00.156741 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:22:00 crc kubenswrapper[4695]: I0320 11:22:00.156981 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5kqds" Mar 20 11:22:00 crc kubenswrapper[4695]: I0320 11:22:00.157614 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:22:00 crc kubenswrapper[4695]: I0320 11:22:00.160205 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566762-qsjfb"] Mar 20 11:22:00 crc kubenswrapper[4695]: I0320 11:22:00.290987 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx4rc\" (UniqueName: \"kubernetes.io/projected/dba5fb66-7070-4236-8f80-64a556e4d7dd-kube-api-access-xx4rc\") pod 
\"auto-csr-approver-29566762-qsjfb\" (UID: \"dba5fb66-7070-4236-8f80-64a556e4d7dd\") " pod="openshift-infra/auto-csr-approver-29566762-qsjfb" Mar 20 11:22:00 crc kubenswrapper[4695]: I0320 11:22:00.392587 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx4rc\" (UniqueName: \"kubernetes.io/projected/dba5fb66-7070-4236-8f80-64a556e4d7dd-kube-api-access-xx4rc\") pod \"auto-csr-approver-29566762-qsjfb\" (UID: \"dba5fb66-7070-4236-8f80-64a556e4d7dd\") " pod="openshift-infra/auto-csr-approver-29566762-qsjfb" Mar 20 11:22:00 crc kubenswrapper[4695]: I0320 11:22:00.417395 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx4rc\" (UniqueName: \"kubernetes.io/projected/dba5fb66-7070-4236-8f80-64a556e4d7dd-kube-api-access-xx4rc\") pod \"auto-csr-approver-29566762-qsjfb\" (UID: \"dba5fb66-7070-4236-8f80-64a556e4d7dd\") " pod="openshift-infra/auto-csr-approver-29566762-qsjfb" Mar 20 11:22:00 crc kubenswrapper[4695]: I0320 11:22:00.472329 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566762-qsjfb" Mar 20 11:22:00 crc kubenswrapper[4695]: I0320 11:22:00.933547 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566762-qsjfb"] Mar 20 11:22:01 crc kubenswrapper[4695]: I0320 11:22:01.309121 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566762-qsjfb" event={"ID":"dba5fb66-7070-4236-8f80-64a556e4d7dd","Type":"ContainerStarted","Data":"5a7cdd5f8e88bd9651b9dfd83c1f2447bf05d0b5ee6e7504184e53b8c9cd0b9e"} Mar 20 11:22:01 crc kubenswrapper[4695]: I0320 11:22:01.886881 4695 scope.go:117] "RemoveContainer" containerID="4f610c401878857d4cef8a9377c20e48073892cd3a403b8cebff93bc8913af7a" Mar 20 11:22:01 crc kubenswrapper[4695]: E0320 11:22:01.887750 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:22:04 crc kubenswrapper[4695]: I0320 11:22:04.338581 4695 generic.go:334] "Generic (PLEG): container finished" podID="dba5fb66-7070-4236-8f80-64a556e4d7dd" containerID="d84f57ba660157dc5fa3ba103a0b87ba24457cf373ba5d39f8f6c875a64d8a8f" exitCode=0 Mar 20 11:22:04 crc kubenswrapper[4695]: I0320 11:22:04.338674 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566762-qsjfb" event={"ID":"dba5fb66-7070-4236-8f80-64a556e4d7dd","Type":"ContainerDied","Data":"d84f57ba660157dc5fa3ba103a0b87ba24457cf373ba5d39f8f6c875a64d8a8f"} Mar 20 11:22:05 crc kubenswrapper[4695]: I0320 11:22:05.647602 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566762-qsjfb" Mar 20 11:22:05 crc kubenswrapper[4695]: I0320 11:22:05.773926 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx4rc\" (UniqueName: \"kubernetes.io/projected/dba5fb66-7070-4236-8f80-64a556e4d7dd-kube-api-access-xx4rc\") pod \"dba5fb66-7070-4236-8f80-64a556e4d7dd\" (UID: \"dba5fb66-7070-4236-8f80-64a556e4d7dd\") " Mar 20 11:22:05 crc kubenswrapper[4695]: I0320 11:22:05.780298 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dba5fb66-7070-4236-8f80-64a556e4d7dd-kube-api-access-xx4rc" (OuterVolumeSpecName: "kube-api-access-xx4rc") pod "dba5fb66-7070-4236-8f80-64a556e4d7dd" (UID: "dba5fb66-7070-4236-8f80-64a556e4d7dd"). InnerVolumeSpecName "kube-api-access-xx4rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:22:05 crc kubenswrapper[4695]: I0320 11:22:05.875977 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx4rc\" (UniqueName: \"kubernetes.io/projected/dba5fb66-7070-4236-8f80-64a556e4d7dd-kube-api-access-xx4rc\") on node \"crc\" DevicePath \"\"" Mar 20 11:22:06 crc kubenswrapper[4695]: I0320 11:22:06.354472 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566762-qsjfb" event={"ID":"dba5fb66-7070-4236-8f80-64a556e4d7dd","Type":"ContainerDied","Data":"5a7cdd5f8e88bd9651b9dfd83c1f2447bf05d0b5ee6e7504184e53b8c9cd0b9e"} Mar 20 11:22:06 crc kubenswrapper[4695]: I0320 11:22:06.354525 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a7cdd5f8e88bd9651b9dfd83c1f2447bf05d0b5ee6e7504184e53b8c9cd0b9e" Mar 20 11:22:06 crc kubenswrapper[4695]: I0320 11:22:06.354547 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566762-qsjfb" Mar 20 11:22:06 crc kubenswrapper[4695]: I0320 11:22:06.742513 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566756-l5sts"] Mar 20 11:22:06 crc kubenswrapper[4695]: I0320 11:22:06.749098 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566756-l5sts"] Mar 20 11:22:06 crc kubenswrapper[4695]: I0320 11:22:06.895668 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="596046b2-828a-4fb3-b57f-db82066f115e" path="/var/lib/kubelet/pods/596046b2-828a-4fb3-b57f-db82066f115e/volumes" Mar 20 11:22:16 crc kubenswrapper[4695]: I0320 11:22:16.887129 4695 scope.go:117] "RemoveContainer" containerID="4f610c401878857d4cef8a9377c20e48073892cd3a403b8cebff93bc8913af7a" Mar 20 11:22:16 crc kubenswrapper[4695]: E0320 11:22:16.888056 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:22:28 crc kubenswrapper[4695]: I0320 11:22:28.886878 4695 scope.go:117] "RemoveContainer" containerID="4f610c401878857d4cef8a9377c20e48073892cd3a403b8cebff93bc8913af7a" Mar 20 11:22:28 crc kubenswrapper[4695]: E0320 11:22:28.887854 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" 
podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:22:40 crc kubenswrapper[4695]: I0320 11:22:40.887591 4695 scope.go:117] "RemoveContainer" containerID="4f610c401878857d4cef8a9377c20e48073892cd3a403b8cebff93bc8913af7a" Mar 20 11:22:40 crc kubenswrapper[4695]: E0320 11:22:40.888549 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:22:54 crc kubenswrapper[4695]: I0320 11:22:54.887510 4695 scope.go:117] "RemoveContainer" containerID="4f610c401878857d4cef8a9377c20e48073892cd3a403b8cebff93bc8913af7a" Mar 20 11:22:54 crc kubenswrapper[4695]: E0320 11:22:54.888516 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:23:06 crc kubenswrapper[4695]: I0320 11:23:06.118669 4695 scope.go:117] "RemoveContainer" containerID="fbc5ef08a27a876cba3befb5d973447021cc6182e120c230541491c63c957df7" Mar 20 11:23:09 crc kubenswrapper[4695]: I0320 11:23:09.887400 4695 scope.go:117] "RemoveContainer" containerID="4f610c401878857d4cef8a9377c20e48073892cd3a403b8cebff93bc8913af7a" Mar 20 11:23:09 crc kubenswrapper[4695]: E0320 11:23:09.888203 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:23:22 crc kubenswrapper[4695]: I0320 11:23:22.893695 4695 scope.go:117] "RemoveContainer" containerID="4f610c401878857d4cef8a9377c20e48073892cd3a403b8cebff93bc8913af7a" Mar 20 11:23:22 crc kubenswrapper[4695]: E0320 11:23:22.894892 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:23:37 crc kubenswrapper[4695]: I0320 11:23:37.886975 4695 scope.go:117] "RemoveContainer" containerID="4f610c401878857d4cef8a9377c20e48073892cd3a403b8cebff93bc8913af7a" Mar 20 11:23:37 crc kubenswrapper[4695]: E0320 11:23:37.887931 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:23:49 crc kubenswrapper[4695]: I0320 11:23:49.887640 4695 scope.go:117] "RemoveContainer" containerID="4f610c401878857d4cef8a9377c20e48073892cd3a403b8cebff93bc8913af7a" Mar 20 11:23:49 crc kubenswrapper[4695]: E0320 11:23:49.888477 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:24:00 crc kubenswrapper[4695]: I0320 11:24:00.146342 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566764-gwbpr"] Mar 20 11:24:00 crc kubenswrapper[4695]: E0320 11:24:00.147577 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dba5fb66-7070-4236-8f80-64a556e4d7dd" containerName="oc" Mar 20 11:24:00 crc kubenswrapper[4695]: I0320 11:24:00.147596 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="dba5fb66-7070-4236-8f80-64a556e4d7dd" containerName="oc" Mar 20 11:24:00 crc kubenswrapper[4695]: I0320 11:24:00.147751 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="dba5fb66-7070-4236-8f80-64a556e4d7dd" containerName="oc" Mar 20 11:24:00 crc kubenswrapper[4695]: I0320 11:24:00.148446 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566764-gwbpr" Mar 20 11:24:00 crc kubenswrapper[4695]: I0320 11:24:00.150899 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:24:00 crc kubenswrapper[4695]: I0320 11:24:00.151512 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5kqds" Mar 20 11:24:00 crc kubenswrapper[4695]: I0320 11:24:00.151656 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:24:00 crc kubenswrapper[4695]: I0320 11:24:00.153172 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566764-gwbpr"] Mar 20 11:24:00 crc kubenswrapper[4695]: I0320 11:24:00.242275 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmcgl\" (UniqueName: \"kubernetes.io/projected/d30de059-4384-4d9b-bffe-7f0133eb3c2f-kube-api-access-dmcgl\") pod \"auto-csr-approver-29566764-gwbpr\" (UID: \"d30de059-4384-4d9b-bffe-7f0133eb3c2f\") " pod="openshift-infra/auto-csr-approver-29566764-gwbpr" Mar 20 11:24:00 crc kubenswrapper[4695]: I0320 11:24:00.343578 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmcgl\" (UniqueName: \"kubernetes.io/projected/d30de059-4384-4d9b-bffe-7f0133eb3c2f-kube-api-access-dmcgl\") pod \"auto-csr-approver-29566764-gwbpr\" (UID: \"d30de059-4384-4d9b-bffe-7f0133eb3c2f\") " pod="openshift-infra/auto-csr-approver-29566764-gwbpr" Mar 20 11:24:00 crc kubenswrapper[4695]: I0320 11:24:00.365978 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmcgl\" (UniqueName: \"kubernetes.io/projected/d30de059-4384-4d9b-bffe-7f0133eb3c2f-kube-api-access-dmcgl\") pod \"auto-csr-approver-29566764-gwbpr\" (UID: \"d30de059-4384-4d9b-bffe-7f0133eb3c2f\") " 
pod="openshift-infra/auto-csr-approver-29566764-gwbpr" Mar 20 11:24:00 crc kubenswrapper[4695]: I0320 11:24:00.473816 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566764-gwbpr" Mar 20 11:24:00 crc kubenswrapper[4695]: I0320 11:24:00.907995 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566764-gwbpr"] Mar 20 11:24:01 crc kubenswrapper[4695]: I0320 11:24:01.214294 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566764-gwbpr" event={"ID":"d30de059-4384-4d9b-bffe-7f0133eb3c2f","Type":"ContainerStarted","Data":"544644a41955477c94fc74f22cce5c7d3e58595abd85094099ac3f7c6ebc16d6"} Mar 20 11:24:02 crc kubenswrapper[4695]: I0320 11:24:02.894369 4695 scope.go:117] "RemoveContainer" containerID="4f610c401878857d4cef8a9377c20e48073892cd3a403b8cebff93bc8913af7a" Mar 20 11:24:02 crc kubenswrapper[4695]: E0320 11:24:02.894970 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:24:04 crc kubenswrapper[4695]: I0320 11:24:04.254237 4695 generic.go:334] "Generic (PLEG): container finished" podID="d30de059-4384-4d9b-bffe-7f0133eb3c2f" containerID="a970e576a05b9bd8dd9d5e1ad2049639edcd0b88c8962334a9b07092c5ba3d03" exitCode=0 Mar 20 11:24:04 crc kubenswrapper[4695]: I0320 11:24:04.254675 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566764-gwbpr" event={"ID":"d30de059-4384-4d9b-bffe-7f0133eb3c2f","Type":"ContainerDied","Data":"a970e576a05b9bd8dd9d5e1ad2049639edcd0b88c8962334a9b07092c5ba3d03"} 
Mar 20 11:24:05 crc kubenswrapper[4695]: I0320 11:24:05.552504 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566764-gwbpr" Mar 20 11:24:05 crc kubenswrapper[4695]: I0320 11:24:05.621487 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmcgl\" (UniqueName: \"kubernetes.io/projected/d30de059-4384-4d9b-bffe-7f0133eb3c2f-kube-api-access-dmcgl\") pod \"d30de059-4384-4d9b-bffe-7f0133eb3c2f\" (UID: \"d30de059-4384-4d9b-bffe-7f0133eb3c2f\") " Mar 20 11:24:05 crc kubenswrapper[4695]: I0320 11:24:05.628480 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d30de059-4384-4d9b-bffe-7f0133eb3c2f-kube-api-access-dmcgl" (OuterVolumeSpecName: "kube-api-access-dmcgl") pod "d30de059-4384-4d9b-bffe-7f0133eb3c2f" (UID: "d30de059-4384-4d9b-bffe-7f0133eb3c2f"). InnerVolumeSpecName "kube-api-access-dmcgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:24:05 crc kubenswrapper[4695]: I0320 11:24:05.723328 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmcgl\" (UniqueName: \"kubernetes.io/projected/d30de059-4384-4d9b-bffe-7f0133eb3c2f-kube-api-access-dmcgl\") on node \"crc\" DevicePath \"\"" Mar 20 11:24:06 crc kubenswrapper[4695]: I0320 11:24:06.272330 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566764-gwbpr" event={"ID":"d30de059-4384-4d9b-bffe-7f0133eb3c2f","Type":"ContainerDied","Data":"544644a41955477c94fc74f22cce5c7d3e58595abd85094099ac3f7c6ebc16d6"} Mar 20 11:24:06 crc kubenswrapper[4695]: I0320 11:24:06.272836 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="544644a41955477c94fc74f22cce5c7d3e58595abd85094099ac3f7c6ebc16d6" Mar 20 11:24:06 crc kubenswrapper[4695]: I0320 11:24:06.272431 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566764-gwbpr" Mar 20 11:24:06 crc kubenswrapper[4695]: I0320 11:24:06.625803 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566758-wgd6c"] Mar 20 11:24:06 crc kubenswrapper[4695]: I0320 11:24:06.635975 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566758-wgd6c"] Mar 20 11:24:06 crc kubenswrapper[4695]: I0320 11:24:06.897054 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b2e67f3-2381-404e-95e8-3fdeb41852ee" path="/var/lib/kubelet/pods/5b2e67f3-2381-404e-95e8-3fdeb41852ee/volumes" Mar 20 11:24:17 crc kubenswrapper[4695]: I0320 11:24:17.887020 4695 scope.go:117] "RemoveContainer" containerID="4f610c401878857d4cef8a9377c20e48073892cd3a403b8cebff93bc8913af7a" Mar 20 11:24:17 crc kubenswrapper[4695]: E0320 11:24:17.887986 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:24:31 crc kubenswrapper[4695]: I0320 11:24:31.887090 4695 scope.go:117] "RemoveContainer" containerID="4f610c401878857d4cef8a9377c20e48073892cd3a403b8cebff93bc8913af7a" Mar 20 11:24:31 crc kubenswrapper[4695]: E0320 11:24:31.887982 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" 
podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:24:45 crc kubenswrapper[4695]: I0320 11:24:45.887505 4695 scope.go:117] "RemoveContainer" containerID="4f610c401878857d4cef8a9377c20e48073892cd3a403b8cebff93bc8913af7a" Mar 20 11:24:45 crc kubenswrapper[4695]: E0320 11:24:45.888647 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:25:00 crc kubenswrapper[4695]: I0320 11:25:00.889594 4695 scope.go:117] "RemoveContainer" containerID="4f610c401878857d4cef8a9377c20e48073892cd3a403b8cebff93bc8913af7a" Mar 20 11:25:00 crc kubenswrapper[4695]: E0320 11:25:00.890789 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:25:06 crc kubenswrapper[4695]: I0320 11:25:06.211693 4695 scope.go:117] "RemoveContainer" containerID="bb86d9183473df9c2977058fcef2b472677a69bf62496a75f5dce79f19b60bec" Mar 20 11:25:14 crc kubenswrapper[4695]: I0320 11:25:14.887385 4695 scope.go:117] "RemoveContainer" containerID="4f610c401878857d4cef8a9377c20e48073892cd3a403b8cebff93bc8913af7a" Mar 20 11:25:14 crc kubenswrapper[4695]: E0320 11:25:14.888324 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:25:27 crc kubenswrapper[4695]: I0320 11:25:27.886650 4695 scope.go:117] "RemoveContainer" containerID="4f610c401878857d4cef8a9377c20e48073892cd3a403b8cebff93bc8913af7a" Mar 20 11:25:27 crc kubenswrapper[4695]: E0320 11:25:27.887687 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:25:39 crc kubenswrapper[4695]: I0320 11:25:39.887578 4695 scope.go:117] "RemoveContainer" containerID="4f610c401878857d4cef8a9377c20e48073892cd3a403b8cebff93bc8913af7a" Mar 20 11:25:41 crc kubenswrapper[4695]: I0320 11:25:41.179372 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" event={"ID":"7859c924-84d7-4855-901e-c77a02c56e3a","Type":"ContainerStarted","Data":"9ebbe35b81a4e25aff4f5292672403a71d315b60bc3a2d22e857c4f0326ec31a"} Mar 20 11:26:00 crc kubenswrapper[4695]: I0320 11:26:00.152416 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566766-lg4t7"] Mar 20 11:26:00 crc kubenswrapper[4695]: E0320 11:26:00.153677 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d30de059-4384-4d9b-bffe-7f0133eb3c2f" containerName="oc" Mar 20 11:26:00 crc kubenswrapper[4695]: I0320 11:26:00.153696 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30de059-4384-4d9b-bffe-7f0133eb3c2f" containerName="oc" Mar 20 11:26:00 
crc kubenswrapper[4695]: I0320 11:26:00.153878 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="d30de059-4384-4d9b-bffe-7f0133eb3c2f" containerName="oc" Mar 20 11:26:00 crc kubenswrapper[4695]: I0320 11:26:00.155963 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566766-lg4t7" Mar 20 11:26:00 crc kubenswrapper[4695]: I0320 11:26:00.159842 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:26:00 crc kubenswrapper[4695]: I0320 11:26:00.161279 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:26:00 crc kubenswrapper[4695]: I0320 11:26:00.161289 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5kqds" Mar 20 11:26:00 crc kubenswrapper[4695]: I0320 11:26:00.162288 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566766-lg4t7"] Mar 20 11:26:00 crc kubenswrapper[4695]: I0320 11:26:00.274162 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj274\" (UniqueName: \"kubernetes.io/projected/5539e8e1-a981-4038-8f12-530a9528dbb2-kube-api-access-gj274\") pod \"auto-csr-approver-29566766-lg4t7\" (UID: \"5539e8e1-a981-4038-8f12-530a9528dbb2\") " pod="openshift-infra/auto-csr-approver-29566766-lg4t7" Mar 20 11:26:00 crc kubenswrapper[4695]: I0320 11:26:00.375189 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj274\" (UniqueName: \"kubernetes.io/projected/5539e8e1-a981-4038-8f12-530a9528dbb2-kube-api-access-gj274\") pod \"auto-csr-approver-29566766-lg4t7\" (UID: \"5539e8e1-a981-4038-8f12-530a9528dbb2\") " pod="openshift-infra/auto-csr-approver-29566766-lg4t7" Mar 20 11:26:00 crc kubenswrapper[4695]: I0320 11:26:00.399968 4695 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj274\" (UniqueName: \"kubernetes.io/projected/5539e8e1-a981-4038-8f12-530a9528dbb2-kube-api-access-gj274\") pod \"auto-csr-approver-29566766-lg4t7\" (UID: \"5539e8e1-a981-4038-8f12-530a9528dbb2\") " pod="openshift-infra/auto-csr-approver-29566766-lg4t7" Mar 20 11:26:00 crc kubenswrapper[4695]: I0320 11:26:00.479314 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566766-lg4t7" Mar 20 11:26:00 crc kubenswrapper[4695]: I0320 11:26:00.929626 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566766-lg4t7"] Mar 20 11:26:00 crc kubenswrapper[4695]: W0320 11:26:00.935048 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5539e8e1_a981_4038_8f12_530a9528dbb2.slice/crio-349c02177eaba99a88be291877d28e0ddf9b73a3f8347a06970f5441eb980205 WatchSource:0}: Error finding container 349c02177eaba99a88be291877d28e0ddf9b73a3f8347a06970f5441eb980205: Status 404 returned error can't find the container with id 349c02177eaba99a88be291877d28e0ddf9b73a3f8347a06970f5441eb980205 Mar 20 11:26:00 crc kubenswrapper[4695]: I0320 11:26:00.937739 4695 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:26:01 crc kubenswrapper[4695]: I0320 11:26:01.396451 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566766-lg4t7" event={"ID":"5539e8e1-a981-4038-8f12-530a9528dbb2","Type":"ContainerStarted","Data":"349c02177eaba99a88be291877d28e0ddf9b73a3f8347a06970f5441eb980205"} Mar 20 11:26:02 crc kubenswrapper[4695]: I0320 11:26:02.406808 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566766-lg4t7" 
event={"ID":"5539e8e1-a981-4038-8f12-530a9528dbb2","Type":"ContainerStarted","Data":"bf1453b540dd334fcc4dd2d405696be52af6ee20cc21d5d5c5ae3f40a8cea434"} Mar 20 11:26:02 crc kubenswrapper[4695]: I0320 11:26:02.423587 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566766-lg4t7" podStartSLOduration=1.420651217 podStartE2EDuration="2.423562474s" podCreationTimestamp="2026-03-20 11:26:00 +0000 UTC" firstStartedPulling="2026-03-20 11:26:00.937446805 +0000 UTC m=+1938.718052368" lastFinishedPulling="2026-03-20 11:26:01.940358062 +0000 UTC m=+1939.720963625" observedRunningTime="2026-03-20 11:26:02.421730348 +0000 UTC m=+1940.202335921" watchObservedRunningTime="2026-03-20 11:26:02.423562474 +0000 UTC m=+1940.204168037" Mar 20 11:26:03 crc kubenswrapper[4695]: I0320 11:26:03.416270 4695 generic.go:334] "Generic (PLEG): container finished" podID="5539e8e1-a981-4038-8f12-530a9528dbb2" containerID="bf1453b540dd334fcc4dd2d405696be52af6ee20cc21d5d5c5ae3f40a8cea434" exitCode=0 Mar 20 11:26:03 crc kubenswrapper[4695]: I0320 11:26:03.416354 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566766-lg4t7" event={"ID":"5539e8e1-a981-4038-8f12-530a9528dbb2","Type":"ContainerDied","Data":"bf1453b540dd334fcc4dd2d405696be52af6ee20cc21d5d5c5ae3f40a8cea434"} Mar 20 11:26:04 crc kubenswrapper[4695]: I0320 11:26:04.732665 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566766-lg4t7" Mar 20 11:26:04 crc kubenswrapper[4695]: I0320 11:26:04.858170 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gj274\" (UniqueName: \"kubernetes.io/projected/5539e8e1-a981-4038-8f12-530a9528dbb2-kube-api-access-gj274\") pod \"5539e8e1-a981-4038-8f12-530a9528dbb2\" (UID: \"5539e8e1-a981-4038-8f12-530a9528dbb2\") " Mar 20 11:26:04 crc kubenswrapper[4695]: I0320 11:26:04.867069 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5539e8e1-a981-4038-8f12-530a9528dbb2-kube-api-access-gj274" (OuterVolumeSpecName: "kube-api-access-gj274") pod "5539e8e1-a981-4038-8f12-530a9528dbb2" (UID: "5539e8e1-a981-4038-8f12-530a9528dbb2"). InnerVolumeSpecName "kube-api-access-gj274". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:26:04 crc kubenswrapper[4695]: I0320 11:26:04.960076 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gj274\" (UniqueName: \"kubernetes.io/projected/5539e8e1-a981-4038-8f12-530a9528dbb2-kube-api-access-gj274\") on node \"crc\" DevicePath \"\"" Mar 20 11:26:05 crc kubenswrapper[4695]: I0320 11:26:05.437504 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566766-lg4t7" event={"ID":"5539e8e1-a981-4038-8f12-530a9528dbb2","Type":"ContainerDied","Data":"349c02177eaba99a88be291877d28e0ddf9b73a3f8347a06970f5441eb980205"} Mar 20 11:26:05 crc kubenswrapper[4695]: I0320 11:26:05.437567 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="349c02177eaba99a88be291877d28e0ddf9b73a3f8347a06970f5441eb980205" Mar 20 11:26:05 crc kubenswrapper[4695]: I0320 11:26:05.437587 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566766-lg4t7" Mar 20 11:26:05 crc kubenswrapper[4695]: I0320 11:26:05.501716 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566760-dp5wf"] Mar 20 11:26:05 crc kubenswrapper[4695]: I0320 11:26:05.506879 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566760-dp5wf"] Mar 20 11:26:06 crc kubenswrapper[4695]: I0320 11:26:06.895845 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91d7b7ee-185c-4a03-b3c0-0916805643f6" path="/var/lib/kubelet/pods/91d7b7ee-185c-4a03-b3c0-0916805643f6/volumes" Mar 20 11:27:06 crc kubenswrapper[4695]: I0320 11:27:06.340744 4695 scope.go:117] "RemoveContainer" containerID="18e2246563997cf8f83b0d4a56e07bdc3425dde2f1ecd1af0a6665bc36d58a0b" Mar 20 11:28:00 crc kubenswrapper[4695]: I0320 11:28:00.155196 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566768-r275f"] Mar 20 11:28:00 crc kubenswrapper[4695]: E0320 11:28:00.156468 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5539e8e1-a981-4038-8f12-530a9528dbb2" containerName="oc" Mar 20 11:28:00 crc kubenswrapper[4695]: I0320 11:28:00.156489 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="5539e8e1-a981-4038-8f12-530a9528dbb2" containerName="oc" Mar 20 11:28:00 crc kubenswrapper[4695]: I0320 11:28:00.163013 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="5539e8e1-a981-4038-8f12-530a9528dbb2" containerName="oc" Mar 20 11:28:00 crc kubenswrapper[4695]: I0320 11:28:00.163598 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566768-r275f"] Mar 20 11:28:00 crc kubenswrapper[4695]: I0320 11:28:00.163696 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566768-r275f" Mar 20 11:28:00 crc kubenswrapper[4695]: I0320 11:28:00.166213 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5kqds" Mar 20 11:28:00 crc kubenswrapper[4695]: I0320 11:28:00.166873 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:28:00 crc kubenswrapper[4695]: I0320 11:28:00.167114 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:28:00 crc kubenswrapper[4695]: I0320 11:28:00.306933 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mh8zn\" (UniqueName: \"kubernetes.io/projected/0b99841f-5ced-44a9-baff-04c5e75c5fb0-kube-api-access-mh8zn\") pod \"auto-csr-approver-29566768-r275f\" (UID: \"0b99841f-5ced-44a9-baff-04c5e75c5fb0\") " pod="openshift-infra/auto-csr-approver-29566768-r275f" Mar 20 11:28:00 crc kubenswrapper[4695]: I0320 11:28:00.408145 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mh8zn\" (UniqueName: \"kubernetes.io/projected/0b99841f-5ced-44a9-baff-04c5e75c5fb0-kube-api-access-mh8zn\") pod \"auto-csr-approver-29566768-r275f\" (UID: \"0b99841f-5ced-44a9-baff-04c5e75c5fb0\") " pod="openshift-infra/auto-csr-approver-29566768-r275f" Mar 20 11:28:00 crc kubenswrapper[4695]: I0320 11:28:00.435818 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mh8zn\" (UniqueName: \"kubernetes.io/projected/0b99841f-5ced-44a9-baff-04c5e75c5fb0-kube-api-access-mh8zn\") pod \"auto-csr-approver-29566768-r275f\" (UID: \"0b99841f-5ced-44a9-baff-04c5e75c5fb0\") " pod="openshift-infra/auto-csr-approver-29566768-r275f" Mar 20 11:28:00 crc kubenswrapper[4695]: I0320 11:28:00.490368 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566768-r275f" Mar 20 11:28:00 crc kubenswrapper[4695]: I0320 11:28:00.988190 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566768-r275f"] Mar 20 11:28:01 crc kubenswrapper[4695]: I0320 11:28:01.740029 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566768-r275f" event={"ID":"0b99841f-5ced-44a9-baff-04c5e75c5fb0","Type":"ContainerStarted","Data":"ec032791de4bc92b9cd48241b7f3f1a900309cbdea108388c85794ca3d2c65be"} Mar 20 11:28:03 crc kubenswrapper[4695]: I0320 11:28:03.755216 4695 generic.go:334] "Generic (PLEG): container finished" podID="0b99841f-5ced-44a9-baff-04c5e75c5fb0" containerID="ca6707bc535c3aa99ca292467ec7b3b6ae4c1386a3dc23b9735ad120fae39464" exitCode=0 Mar 20 11:28:03 crc kubenswrapper[4695]: I0320 11:28:03.755306 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566768-r275f" event={"ID":"0b99841f-5ced-44a9-baff-04c5e75c5fb0","Type":"ContainerDied","Data":"ca6707bc535c3aa99ca292467ec7b3b6ae4c1386a3dc23b9735ad120fae39464"} Mar 20 11:28:05 crc kubenswrapper[4695]: I0320 11:28:05.036326 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566768-r275f" Mar 20 11:28:05 crc kubenswrapper[4695]: I0320 11:28:05.230368 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mh8zn\" (UniqueName: \"kubernetes.io/projected/0b99841f-5ced-44a9-baff-04c5e75c5fb0-kube-api-access-mh8zn\") pod \"0b99841f-5ced-44a9-baff-04c5e75c5fb0\" (UID: \"0b99841f-5ced-44a9-baff-04c5e75c5fb0\") " Mar 20 11:28:05 crc kubenswrapper[4695]: I0320 11:28:05.237314 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b99841f-5ced-44a9-baff-04c5e75c5fb0-kube-api-access-mh8zn" (OuterVolumeSpecName: "kube-api-access-mh8zn") pod "0b99841f-5ced-44a9-baff-04c5e75c5fb0" (UID: "0b99841f-5ced-44a9-baff-04c5e75c5fb0"). InnerVolumeSpecName "kube-api-access-mh8zn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:28:05 crc kubenswrapper[4695]: I0320 11:28:05.332623 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mh8zn\" (UniqueName: \"kubernetes.io/projected/0b99841f-5ced-44a9-baff-04c5e75c5fb0-kube-api-access-mh8zn\") on node \"crc\" DevicePath \"\"" Mar 20 11:28:05 crc kubenswrapper[4695]: I0320 11:28:05.771754 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566768-r275f" event={"ID":"0b99841f-5ced-44a9-baff-04c5e75c5fb0","Type":"ContainerDied","Data":"ec032791de4bc92b9cd48241b7f3f1a900309cbdea108388c85794ca3d2c65be"} Mar 20 11:28:05 crc kubenswrapper[4695]: I0320 11:28:05.772198 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec032791de4bc92b9cd48241b7f3f1a900309cbdea108388c85794ca3d2c65be" Mar 20 11:28:05 crc kubenswrapper[4695]: I0320 11:28:05.771895 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566768-r275f" Mar 20 11:28:06 crc kubenswrapper[4695]: I0320 11:28:06.128382 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566762-qsjfb"] Mar 20 11:28:06 crc kubenswrapper[4695]: I0320 11:28:06.133380 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566762-qsjfb"] Mar 20 11:28:06 crc kubenswrapper[4695]: I0320 11:28:06.928564 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dba5fb66-7070-4236-8f80-64a556e4d7dd" path="/var/lib/kubelet/pods/dba5fb66-7070-4236-8f80-64a556e4d7dd/volumes" Mar 20 11:28:08 crc kubenswrapper[4695]: I0320 11:28:08.431031 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:28:08 crc kubenswrapper[4695]: I0320 11:28:08.431134 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:28:38 crc kubenswrapper[4695]: I0320 11:28:38.431562 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:28:38 crc kubenswrapper[4695]: I0320 11:28:38.432519 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" 
podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:29:01 crc kubenswrapper[4695]: I0320 11:29:01.763790 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hxhzm"] Mar 20 11:29:01 crc kubenswrapper[4695]: E0320 11:29:01.767896 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b99841f-5ced-44a9-baff-04c5e75c5fb0" containerName="oc" Mar 20 11:29:01 crc kubenswrapper[4695]: I0320 11:29:01.768484 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b99841f-5ced-44a9-baff-04c5e75c5fb0" containerName="oc" Mar 20 11:29:01 crc kubenswrapper[4695]: I0320 11:29:01.769761 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b99841f-5ced-44a9-baff-04c5e75c5fb0" containerName="oc" Mar 20 11:29:01 crc kubenswrapper[4695]: I0320 11:29:01.771095 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hxhzm" Mar 20 11:29:01 crc kubenswrapper[4695]: I0320 11:29:01.774894 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hxhzm"] Mar 20 11:29:01 crc kubenswrapper[4695]: I0320 11:29:01.789547 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41785d21-37b6-4226-a059-0fc062e4cfc2-catalog-content\") pod \"certified-operators-hxhzm\" (UID: \"41785d21-37b6-4226-a059-0fc062e4cfc2\") " pod="openshift-marketplace/certified-operators-hxhzm" Mar 20 11:29:01 crc kubenswrapper[4695]: I0320 11:29:01.789599 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7zbp\" (UniqueName: \"kubernetes.io/projected/41785d21-37b6-4226-a059-0fc062e4cfc2-kube-api-access-r7zbp\") pod \"certified-operators-hxhzm\" (UID: \"41785d21-37b6-4226-a059-0fc062e4cfc2\") " pod="openshift-marketplace/certified-operators-hxhzm" Mar 20 11:29:01 crc kubenswrapper[4695]: I0320 11:29:01.789635 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41785d21-37b6-4226-a059-0fc062e4cfc2-utilities\") pod \"certified-operators-hxhzm\" (UID: \"41785d21-37b6-4226-a059-0fc062e4cfc2\") " pod="openshift-marketplace/certified-operators-hxhzm" Mar 20 11:29:01 crc kubenswrapper[4695]: I0320 11:29:01.890531 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41785d21-37b6-4226-a059-0fc062e4cfc2-catalog-content\") pod \"certified-operators-hxhzm\" (UID: \"41785d21-37b6-4226-a059-0fc062e4cfc2\") " pod="openshift-marketplace/certified-operators-hxhzm" Mar 20 11:29:01 crc kubenswrapper[4695]: I0320 11:29:01.890584 4695 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-r7zbp\" (UniqueName: \"kubernetes.io/projected/41785d21-37b6-4226-a059-0fc062e4cfc2-kube-api-access-r7zbp\") pod \"certified-operators-hxhzm\" (UID: \"41785d21-37b6-4226-a059-0fc062e4cfc2\") " pod="openshift-marketplace/certified-operators-hxhzm" Mar 20 11:29:01 crc kubenswrapper[4695]: I0320 11:29:01.890621 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41785d21-37b6-4226-a059-0fc062e4cfc2-utilities\") pod \"certified-operators-hxhzm\" (UID: \"41785d21-37b6-4226-a059-0fc062e4cfc2\") " pod="openshift-marketplace/certified-operators-hxhzm" Mar 20 11:29:01 crc kubenswrapper[4695]: I0320 11:29:01.891562 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41785d21-37b6-4226-a059-0fc062e4cfc2-utilities\") pod \"certified-operators-hxhzm\" (UID: \"41785d21-37b6-4226-a059-0fc062e4cfc2\") " pod="openshift-marketplace/certified-operators-hxhzm" Mar 20 11:29:01 crc kubenswrapper[4695]: I0320 11:29:01.891830 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41785d21-37b6-4226-a059-0fc062e4cfc2-catalog-content\") pod \"certified-operators-hxhzm\" (UID: \"41785d21-37b6-4226-a059-0fc062e4cfc2\") " pod="openshift-marketplace/certified-operators-hxhzm" Mar 20 11:29:01 crc kubenswrapper[4695]: I0320 11:29:01.915129 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7zbp\" (UniqueName: \"kubernetes.io/projected/41785d21-37b6-4226-a059-0fc062e4cfc2-kube-api-access-r7zbp\") pod \"certified-operators-hxhzm\" (UID: \"41785d21-37b6-4226-a059-0fc062e4cfc2\") " pod="openshift-marketplace/certified-operators-hxhzm" Mar 20 11:29:02 crc kubenswrapper[4695]: I0320 11:29:02.096524 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hxhzm" Mar 20 11:29:02 crc kubenswrapper[4695]: I0320 11:29:02.615815 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hxhzm"] Mar 20 11:29:03 crc kubenswrapper[4695]: I0320 11:29:03.257359 4695 generic.go:334] "Generic (PLEG): container finished" podID="41785d21-37b6-4226-a059-0fc062e4cfc2" containerID="b73cd916ba7985e10c3481e9bb81b2f7ff5b0c1ab311eafa35c8b0d8bfe3c084" exitCode=0 Mar 20 11:29:03 crc kubenswrapper[4695]: I0320 11:29:03.257406 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hxhzm" event={"ID":"41785d21-37b6-4226-a059-0fc062e4cfc2","Type":"ContainerDied","Data":"b73cd916ba7985e10c3481e9bb81b2f7ff5b0c1ab311eafa35c8b0d8bfe3c084"} Mar 20 11:29:03 crc kubenswrapper[4695]: I0320 11:29:03.257433 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hxhzm" event={"ID":"41785d21-37b6-4226-a059-0fc062e4cfc2","Type":"ContainerStarted","Data":"d642aabf67cf1e82ffd8ed88b59eea0b55bbbd9c530eb511e20c17d76a190874"} Mar 20 11:29:04 crc kubenswrapper[4695]: I0320 11:29:04.266007 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hxhzm" event={"ID":"41785d21-37b6-4226-a059-0fc062e4cfc2","Type":"ContainerStarted","Data":"3faf2507dfced0eb00af2c20278940f9b5c182f19944e33ab02f504fb3e92c3a"} Mar 20 11:29:05 crc kubenswrapper[4695]: I0320 11:29:05.277421 4695 generic.go:334] "Generic (PLEG): container finished" podID="41785d21-37b6-4226-a059-0fc062e4cfc2" containerID="3faf2507dfced0eb00af2c20278940f9b5c182f19944e33ab02f504fb3e92c3a" exitCode=0 Mar 20 11:29:05 crc kubenswrapper[4695]: I0320 11:29:05.277493 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hxhzm" 
event={"ID":"41785d21-37b6-4226-a059-0fc062e4cfc2","Type":"ContainerDied","Data":"3faf2507dfced0eb00af2c20278940f9b5c182f19944e33ab02f504fb3e92c3a"} Mar 20 11:29:06 crc kubenswrapper[4695]: I0320 11:29:06.435270 4695 scope.go:117] "RemoveContainer" containerID="d84f57ba660157dc5fa3ba103a0b87ba24457cf373ba5d39f8f6c875a64d8a8f" Mar 20 11:29:06 crc kubenswrapper[4695]: I0320 11:29:06.960089 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mvjkq"] Mar 20 11:29:06 crc kubenswrapper[4695]: I0320 11:29:06.962419 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mvjkq" Mar 20 11:29:06 crc kubenswrapper[4695]: I0320 11:29:06.979074 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mvjkq"] Mar 20 11:29:07 crc kubenswrapper[4695]: I0320 11:29:07.061903 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a-utilities\") pod \"redhat-operators-mvjkq\" (UID: \"f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a\") " pod="openshift-marketplace/redhat-operators-mvjkq" Mar 20 11:29:07 crc kubenswrapper[4695]: I0320 11:29:07.061987 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a-catalog-content\") pod \"redhat-operators-mvjkq\" (UID: \"f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a\") " pod="openshift-marketplace/redhat-operators-mvjkq" Mar 20 11:29:07 crc kubenswrapper[4695]: I0320 11:29:07.062008 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqxvk\" (UniqueName: \"kubernetes.io/projected/f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a-kube-api-access-pqxvk\") pod \"redhat-operators-mvjkq\" (UID: 
\"f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a\") " pod="openshift-marketplace/redhat-operators-mvjkq" Mar 20 11:29:07 crc kubenswrapper[4695]: I0320 11:29:07.163330 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a-utilities\") pod \"redhat-operators-mvjkq\" (UID: \"f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a\") " pod="openshift-marketplace/redhat-operators-mvjkq" Mar 20 11:29:07 crc kubenswrapper[4695]: I0320 11:29:07.163458 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a-catalog-content\") pod \"redhat-operators-mvjkq\" (UID: \"f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a\") " pod="openshift-marketplace/redhat-operators-mvjkq" Mar 20 11:29:07 crc kubenswrapper[4695]: I0320 11:29:07.163492 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqxvk\" (UniqueName: \"kubernetes.io/projected/f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a-kube-api-access-pqxvk\") pod \"redhat-operators-mvjkq\" (UID: \"f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a\") " pod="openshift-marketplace/redhat-operators-mvjkq" Mar 20 11:29:07 crc kubenswrapper[4695]: I0320 11:29:07.163937 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a-utilities\") pod \"redhat-operators-mvjkq\" (UID: \"f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a\") " pod="openshift-marketplace/redhat-operators-mvjkq" Mar 20 11:29:07 crc kubenswrapper[4695]: I0320 11:29:07.164008 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a-catalog-content\") pod \"redhat-operators-mvjkq\" (UID: \"f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a\") " 
pod="openshift-marketplace/redhat-operators-mvjkq" Mar 20 11:29:07 crc kubenswrapper[4695]: I0320 11:29:07.198441 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqxvk\" (UniqueName: \"kubernetes.io/projected/f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a-kube-api-access-pqxvk\") pod \"redhat-operators-mvjkq\" (UID: \"f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a\") " pod="openshift-marketplace/redhat-operators-mvjkq" Mar 20 11:29:07 crc kubenswrapper[4695]: I0320 11:29:07.285768 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mvjkq" Mar 20 11:29:07 crc kubenswrapper[4695]: I0320 11:29:07.292782 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hxhzm" event={"ID":"41785d21-37b6-4226-a059-0fc062e4cfc2","Type":"ContainerStarted","Data":"4671857ee94ca8b720c8ed4e63ee1f208d26c3fb535b1679e94921da7676c644"} Mar 20 11:29:07 crc kubenswrapper[4695]: I0320 11:29:07.318023 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hxhzm" podStartSLOduration=2.887332445 podStartE2EDuration="6.318006659s" podCreationTimestamp="2026-03-20 11:29:01 +0000 UTC" firstStartedPulling="2026-03-20 11:29:03.266141717 +0000 UTC m=+2121.046747270" lastFinishedPulling="2026-03-20 11:29:06.696815921 +0000 UTC m=+2124.477421484" observedRunningTime="2026-03-20 11:29:07.315582147 +0000 UTC m=+2125.096187700" watchObservedRunningTime="2026-03-20 11:29:07.318006659 +0000 UTC m=+2125.098612222" Mar 20 11:29:07 crc kubenswrapper[4695]: I0320 11:29:07.693851 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mvjkq"] Mar 20 11:29:08 crc kubenswrapper[4695]: I0320 11:29:08.301026 4695 generic.go:334] "Generic (PLEG): container finished" podID="f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a" 
containerID="6e35c6fe15ec171c2391e76e6abf19c962af8350238cde256e2aa28ec00a1f69" exitCode=0 Mar 20 11:29:08 crc kubenswrapper[4695]: I0320 11:29:08.301084 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvjkq" event={"ID":"f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a","Type":"ContainerDied","Data":"6e35c6fe15ec171c2391e76e6abf19c962af8350238cde256e2aa28ec00a1f69"} Mar 20 11:29:08 crc kubenswrapper[4695]: I0320 11:29:08.301160 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvjkq" event={"ID":"f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a","Type":"ContainerStarted","Data":"7d6109fcffd59641b90cf84b436b69e57c53b328435171e05dcde17e76f04ae8"} Mar 20 11:29:08 crc kubenswrapper[4695]: I0320 11:29:08.431430 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:29:08 crc kubenswrapper[4695]: I0320 11:29:08.431799 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:29:08 crc kubenswrapper[4695]: I0320 11:29:08.431859 4695 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" Mar 20 11:29:08 crc kubenswrapper[4695]: I0320 11:29:08.432697 4695 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9ebbe35b81a4e25aff4f5292672403a71d315b60bc3a2d22e857c4f0326ec31a"} 
pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:29:08 crc kubenswrapper[4695]: I0320 11:29:08.432774 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" containerID="cri-o://9ebbe35b81a4e25aff4f5292672403a71d315b60bc3a2d22e857c4f0326ec31a" gracePeriod=600 Mar 20 11:29:09 crc kubenswrapper[4695]: I0320 11:29:09.310113 4695 generic.go:334] "Generic (PLEG): container finished" podID="7859c924-84d7-4855-901e-c77a02c56e3a" containerID="9ebbe35b81a4e25aff4f5292672403a71d315b60bc3a2d22e857c4f0326ec31a" exitCode=0 Mar 20 11:29:09 crc kubenswrapper[4695]: I0320 11:29:09.310197 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" event={"ID":"7859c924-84d7-4855-901e-c77a02c56e3a","Type":"ContainerDied","Data":"9ebbe35b81a4e25aff4f5292672403a71d315b60bc3a2d22e857c4f0326ec31a"} Mar 20 11:29:09 crc kubenswrapper[4695]: I0320 11:29:09.310588 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" event={"ID":"7859c924-84d7-4855-901e-c77a02c56e3a","Type":"ContainerStarted","Data":"36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd"} Mar 20 11:29:09 crc kubenswrapper[4695]: I0320 11:29:09.310616 4695 scope.go:117] "RemoveContainer" containerID="4f610c401878857d4cef8a9377c20e48073892cd3a403b8cebff93bc8913af7a" Mar 20 11:29:10 crc kubenswrapper[4695]: I0320 11:29:10.322266 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvjkq" event={"ID":"f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a","Type":"ContainerStarted","Data":"419ca6268b2bfdd6beabeeda5dbc0527c2f1e06a14e2842072684aa9964adaf6"} Mar 20 11:29:12 
crc kubenswrapper[4695]: I0320 11:29:12.097073 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hxhzm" Mar 20 11:29:12 crc kubenswrapper[4695]: I0320 11:29:12.097870 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hxhzm" Mar 20 11:29:12 crc kubenswrapper[4695]: I0320 11:29:12.160601 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hxhzm" Mar 20 11:29:13 crc kubenswrapper[4695]: I0320 11:29:13.061441 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hxhzm" Mar 20 11:29:13 crc kubenswrapper[4695]: I0320 11:29:13.861570 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hxhzm"] Mar 20 11:29:14 crc kubenswrapper[4695]: I0320 11:29:14.350947 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hxhzm" podUID="41785d21-37b6-4226-a059-0fc062e4cfc2" containerName="registry-server" containerID="cri-o://4671857ee94ca8b720c8ed4e63ee1f208d26c3fb535b1679e94921da7676c644" gracePeriod=2 Mar 20 11:29:15 crc kubenswrapper[4695]: I0320 11:29:15.369737 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hxhzm" Mar 20 11:29:15 crc kubenswrapper[4695]: I0320 11:29:15.369754 4695 generic.go:334] "Generic (PLEG): container finished" podID="f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a" containerID="419ca6268b2bfdd6beabeeda5dbc0527c2f1e06a14e2842072684aa9964adaf6" exitCode=0 Mar 20 11:29:15 crc kubenswrapper[4695]: I0320 11:29:15.369844 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvjkq" event={"ID":"f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a","Type":"ContainerDied","Data":"419ca6268b2bfdd6beabeeda5dbc0527c2f1e06a14e2842072684aa9964adaf6"} Mar 20 11:29:15 crc kubenswrapper[4695]: I0320 11:29:15.376409 4695 generic.go:334] "Generic (PLEG): container finished" podID="41785d21-37b6-4226-a059-0fc062e4cfc2" containerID="4671857ee94ca8b720c8ed4e63ee1f208d26c3fb535b1679e94921da7676c644" exitCode=0 Mar 20 11:29:15 crc kubenswrapper[4695]: I0320 11:29:15.376475 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hxhzm" event={"ID":"41785d21-37b6-4226-a059-0fc062e4cfc2","Type":"ContainerDied","Data":"4671857ee94ca8b720c8ed4e63ee1f208d26c3fb535b1679e94921da7676c644"} Mar 20 11:29:15 crc kubenswrapper[4695]: I0320 11:29:15.376521 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hxhzm" event={"ID":"41785d21-37b6-4226-a059-0fc062e4cfc2","Type":"ContainerDied","Data":"d642aabf67cf1e82ffd8ed88b59eea0b55bbbd9c530eb511e20c17d76a190874"} Mar 20 11:29:15 crc kubenswrapper[4695]: I0320 11:29:15.376551 4695 scope.go:117] "RemoveContainer" containerID="4671857ee94ca8b720c8ed4e63ee1f208d26c3fb535b1679e94921da7676c644" Mar 20 11:29:15 crc kubenswrapper[4695]: I0320 11:29:15.399308 4695 scope.go:117] "RemoveContainer" containerID="3faf2507dfced0eb00af2c20278940f9b5c182f19944e33ab02f504fb3e92c3a" Mar 20 11:29:15 crc kubenswrapper[4695]: I0320 11:29:15.450268 4695 scope.go:117] 
"RemoveContainer" containerID="b73cd916ba7985e10c3481e9bb81b2f7ff5b0c1ab311eafa35c8b0d8bfe3c084" Mar 20 11:29:15 crc kubenswrapper[4695]: I0320 11:29:15.479699 4695 scope.go:117] "RemoveContainer" containerID="4671857ee94ca8b720c8ed4e63ee1f208d26c3fb535b1679e94921da7676c644" Mar 20 11:29:15 crc kubenswrapper[4695]: E0320 11:29:15.480694 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4671857ee94ca8b720c8ed4e63ee1f208d26c3fb535b1679e94921da7676c644\": container with ID starting with 4671857ee94ca8b720c8ed4e63ee1f208d26c3fb535b1679e94921da7676c644 not found: ID does not exist" containerID="4671857ee94ca8b720c8ed4e63ee1f208d26c3fb535b1679e94921da7676c644" Mar 20 11:29:15 crc kubenswrapper[4695]: I0320 11:29:15.480752 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4671857ee94ca8b720c8ed4e63ee1f208d26c3fb535b1679e94921da7676c644"} err="failed to get container status \"4671857ee94ca8b720c8ed4e63ee1f208d26c3fb535b1679e94921da7676c644\": rpc error: code = NotFound desc = could not find container \"4671857ee94ca8b720c8ed4e63ee1f208d26c3fb535b1679e94921da7676c644\": container with ID starting with 4671857ee94ca8b720c8ed4e63ee1f208d26c3fb535b1679e94921da7676c644 not found: ID does not exist" Mar 20 11:29:15 crc kubenswrapper[4695]: I0320 11:29:15.480795 4695 scope.go:117] "RemoveContainer" containerID="3faf2507dfced0eb00af2c20278940f9b5c182f19944e33ab02f504fb3e92c3a" Mar 20 11:29:15 crc kubenswrapper[4695]: E0320 11:29:15.481295 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3faf2507dfced0eb00af2c20278940f9b5c182f19944e33ab02f504fb3e92c3a\": container with ID starting with 3faf2507dfced0eb00af2c20278940f9b5c182f19944e33ab02f504fb3e92c3a not found: ID does not exist" containerID="3faf2507dfced0eb00af2c20278940f9b5c182f19944e33ab02f504fb3e92c3a" Mar 20 11:29:15 crc 
kubenswrapper[4695]: I0320 11:29:15.481350 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3faf2507dfced0eb00af2c20278940f9b5c182f19944e33ab02f504fb3e92c3a"} err="failed to get container status \"3faf2507dfced0eb00af2c20278940f9b5c182f19944e33ab02f504fb3e92c3a\": rpc error: code = NotFound desc = could not find container \"3faf2507dfced0eb00af2c20278940f9b5c182f19944e33ab02f504fb3e92c3a\": container with ID starting with 3faf2507dfced0eb00af2c20278940f9b5c182f19944e33ab02f504fb3e92c3a not found: ID does not exist" Mar 20 11:29:15 crc kubenswrapper[4695]: I0320 11:29:15.481385 4695 scope.go:117] "RemoveContainer" containerID="b73cd916ba7985e10c3481e9bb81b2f7ff5b0c1ab311eafa35c8b0d8bfe3c084" Mar 20 11:29:15 crc kubenswrapper[4695]: E0320 11:29:15.481817 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b73cd916ba7985e10c3481e9bb81b2f7ff5b0c1ab311eafa35c8b0d8bfe3c084\": container with ID starting with b73cd916ba7985e10c3481e9bb81b2f7ff5b0c1ab311eafa35c8b0d8bfe3c084 not found: ID does not exist" containerID="b73cd916ba7985e10c3481e9bb81b2f7ff5b0c1ab311eafa35c8b0d8bfe3c084" Mar 20 11:29:15 crc kubenswrapper[4695]: I0320 11:29:15.481858 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b73cd916ba7985e10c3481e9bb81b2f7ff5b0c1ab311eafa35c8b0d8bfe3c084"} err="failed to get container status \"b73cd916ba7985e10c3481e9bb81b2f7ff5b0c1ab311eafa35c8b0d8bfe3c084\": rpc error: code = NotFound desc = could not find container \"b73cd916ba7985e10c3481e9bb81b2f7ff5b0c1ab311eafa35c8b0d8bfe3c084\": container with ID starting with b73cd916ba7985e10c3481e9bb81b2f7ff5b0c1ab311eafa35c8b0d8bfe3c084 not found: ID does not exist" Mar 20 11:29:15 crc kubenswrapper[4695]: I0320 11:29:15.488662 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/41785d21-37b6-4226-a059-0fc062e4cfc2-utilities\") pod \"41785d21-37b6-4226-a059-0fc062e4cfc2\" (UID: \"41785d21-37b6-4226-a059-0fc062e4cfc2\") " Mar 20 11:29:15 crc kubenswrapper[4695]: I0320 11:29:15.488781 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41785d21-37b6-4226-a059-0fc062e4cfc2-catalog-content\") pod \"41785d21-37b6-4226-a059-0fc062e4cfc2\" (UID: \"41785d21-37b6-4226-a059-0fc062e4cfc2\") " Mar 20 11:29:15 crc kubenswrapper[4695]: I0320 11:29:15.488964 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7zbp\" (UniqueName: \"kubernetes.io/projected/41785d21-37b6-4226-a059-0fc062e4cfc2-kube-api-access-r7zbp\") pod \"41785d21-37b6-4226-a059-0fc062e4cfc2\" (UID: \"41785d21-37b6-4226-a059-0fc062e4cfc2\") " Mar 20 11:29:15 crc kubenswrapper[4695]: I0320 11:29:15.490308 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41785d21-37b6-4226-a059-0fc062e4cfc2-utilities" (OuterVolumeSpecName: "utilities") pod "41785d21-37b6-4226-a059-0fc062e4cfc2" (UID: "41785d21-37b6-4226-a059-0fc062e4cfc2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:29:15 crc kubenswrapper[4695]: I0320 11:29:15.503168 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41785d21-37b6-4226-a059-0fc062e4cfc2-kube-api-access-r7zbp" (OuterVolumeSpecName: "kube-api-access-r7zbp") pod "41785d21-37b6-4226-a059-0fc062e4cfc2" (UID: "41785d21-37b6-4226-a059-0fc062e4cfc2"). InnerVolumeSpecName "kube-api-access-r7zbp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:29:15 crc kubenswrapper[4695]: I0320 11:29:15.555557 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41785d21-37b6-4226-a059-0fc062e4cfc2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "41785d21-37b6-4226-a059-0fc062e4cfc2" (UID: "41785d21-37b6-4226-a059-0fc062e4cfc2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:29:15 crc kubenswrapper[4695]: I0320 11:29:15.591122 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7zbp\" (UniqueName: \"kubernetes.io/projected/41785d21-37b6-4226-a059-0fc062e4cfc2-kube-api-access-r7zbp\") on node \"crc\" DevicePath \"\"" Mar 20 11:29:15 crc kubenswrapper[4695]: I0320 11:29:15.591539 4695 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41785d21-37b6-4226-a059-0fc062e4cfc2-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:29:15 crc kubenswrapper[4695]: I0320 11:29:15.591558 4695 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41785d21-37b6-4226-a059-0fc062e4cfc2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:29:16 crc kubenswrapper[4695]: I0320 11:29:16.386678 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hxhzm" Mar 20 11:29:16 crc kubenswrapper[4695]: I0320 11:29:16.418425 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hxhzm"] Mar 20 11:29:16 crc kubenswrapper[4695]: I0320 11:29:16.426452 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hxhzm"] Mar 20 11:29:16 crc kubenswrapper[4695]: I0320 11:29:16.900237 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41785d21-37b6-4226-a059-0fc062e4cfc2" path="/var/lib/kubelet/pods/41785d21-37b6-4226-a059-0fc062e4cfc2/volumes" Mar 20 11:29:18 crc kubenswrapper[4695]: I0320 11:29:18.403476 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvjkq" event={"ID":"f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a","Type":"ContainerStarted","Data":"f7f2e4d94e667e68024ab65bc2278d1a8524c92d035e164ff88231e94d99d863"} Mar 20 11:29:18 crc kubenswrapper[4695]: I0320 11:29:18.431686 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mvjkq" podStartSLOduration=3.367474573 podStartE2EDuration="12.431665411s" podCreationTimestamp="2026-03-20 11:29:06 +0000 UTC" firstStartedPulling="2026-03-20 11:29:08.303621024 +0000 UTC m=+2126.084226587" lastFinishedPulling="2026-03-20 11:29:17.367811862 +0000 UTC m=+2135.148417425" observedRunningTime="2026-03-20 11:29:18.42534455 +0000 UTC m=+2136.205950123" watchObservedRunningTime="2026-03-20 11:29:18.431665411 +0000 UTC m=+2136.212270974" Mar 20 11:29:27 crc kubenswrapper[4695]: I0320 11:29:27.286736 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mvjkq" Mar 20 11:29:27 crc kubenswrapper[4695]: I0320 11:29:27.287452 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-mvjkq" Mar 20 11:29:27 crc kubenswrapper[4695]: I0320 11:29:27.330434 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mvjkq" Mar 20 11:29:27 crc kubenswrapper[4695]: I0320 11:29:27.514998 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mvjkq" Mar 20 11:29:29 crc kubenswrapper[4695]: I0320 11:29:29.753200 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mvjkq"] Mar 20 11:29:29 crc kubenswrapper[4695]: I0320 11:29:29.753897 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mvjkq" podUID="f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a" containerName="registry-server" containerID="cri-o://f7f2e4d94e667e68024ab65bc2278d1a8524c92d035e164ff88231e94d99d863" gracePeriod=2 Mar 20 11:29:30 crc kubenswrapper[4695]: I0320 11:29:30.184527 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mvjkq" Mar 20 11:29:30 crc kubenswrapper[4695]: I0320 11:29:30.310669 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a-catalog-content\") pod \"f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a\" (UID: \"f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a\") " Mar 20 11:29:30 crc kubenswrapper[4695]: I0320 11:29:30.310827 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a-utilities\") pod \"f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a\" (UID: \"f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a\") " Mar 20 11:29:30 crc kubenswrapper[4695]: I0320 11:29:30.310864 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqxvk\" (UniqueName: \"kubernetes.io/projected/f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a-kube-api-access-pqxvk\") pod \"f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a\" (UID: \"f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a\") " Mar 20 11:29:30 crc kubenswrapper[4695]: I0320 11:29:30.313812 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a-utilities" (OuterVolumeSpecName: "utilities") pod "f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a" (UID: "f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:29:30 crc kubenswrapper[4695]: I0320 11:29:30.339828 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a-kube-api-access-pqxvk" (OuterVolumeSpecName: "kube-api-access-pqxvk") pod "f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a" (UID: "f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a"). InnerVolumeSpecName "kube-api-access-pqxvk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:29:30 crc kubenswrapper[4695]: I0320 11:29:30.412729 4695 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:29:30 crc kubenswrapper[4695]: I0320 11:29:30.412775 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqxvk\" (UniqueName: \"kubernetes.io/projected/f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a-kube-api-access-pqxvk\") on node \"crc\" DevicePath \"\"" Mar 20 11:29:30 crc kubenswrapper[4695]: I0320 11:29:30.478562 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a" (UID: "f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:29:30 crc kubenswrapper[4695]: I0320 11:29:30.490217 4695 generic.go:334] "Generic (PLEG): container finished" podID="f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a" containerID="f7f2e4d94e667e68024ab65bc2278d1a8524c92d035e164ff88231e94d99d863" exitCode=0 Mar 20 11:29:30 crc kubenswrapper[4695]: I0320 11:29:30.490274 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvjkq" event={"ID":"f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a","Type":"ContainerDied","Data":"f7f2e4d94e667e68024ab65bc2278d1a8524c92d035e164ff88231e94d99d863"} Mar 20 11:29:30 crc kubenswrapper[4695]: I0320 11:29:30.490312 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mvjkq" event={"ID":"f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a","Type":"ContainerDied","Data":"7d6109fcffd59641b90cf84b436b69e57c53b328435171e05dcde17e76f04ae8"} Mar 20 11:29:30 crc kubenswrapper[4695]: I0320 11:29:30.490333 
4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mvjkq" Mar 20 11:29:30 crc kubenswrapper[4695]: I0320 11:29:30.490335 4695 scope.go:117] "RemoveContainer" containerID="f7f2e4d94e667e68024ab65bc2278d1a8524c92d035e164ff88231e94d99d863" Mar 20 11:29:30 crc kubenswrapper[4695]: I0320 11:29:30.514552 4695 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:29:30 crc kubenswrapper[4695]: I0320 11:29:30.514660 4695 scope.go:117] "RemoveContainer" containerID="419ca6268b2bfdd6beabeeda5dbc0527c2f1e06a14e2842072684aa9964adaf6" Mar 20 11:29:30 crc kubenswrapper[4695]: I0320 11:29:30.531737 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mvjkq"] Mar 20 11:29:30 crc kubenswrapper[4695]: I0320 11:29:30.535286 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mvjkq"] Mar 20 11:29:30 crc kubenswrapper[4695]: I0320 11:29:30.549156 4695 scope.go:117] "RemoveContainer" containerID="6e35c6fe15ec171c2391e76e6abf19c962af8350238cde256e2aa28ec00a1f69" Mar 20 11:29:30 crc kubenswrapper[4695]: I0320 11:29:30.573996 4695 scope.go:117] "RemoveContainer" containerID="f7f2e4d94e667e68024ab65bc2278d1a8524c92d035e164ff88231e94d99d863" Mar 20 11:29:30 crc kubenswrapper[4695]: E0320 11:29:30.574641 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7f2e4d94e667e68024ab65bc2278d1a8524c92d035e164ff88231e94d99d863\": container with ID starting with f7f2e4d94e667e68024ab65bc2278d1a8524c92d035e164ff88231e94d99d863 not found: ID does not exist" containerID="f7f2e4d94e667e68024ab65bc2278d1a8524c92d035e164ff88231e94d99d863" Mar 20 11:29:30 crc kubenswrapper[4695]: I0320 11:29:30.574685 4695 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7f2e4d94e667e68024ab65bc2278d1a8524c92d035e164ff88231e94d99d863"} err="failed to get container status \"f7f2e4d94e667e68024ab65bc2278d1a8524c92d035e164ff88231e94d99d863\": rpc error: code = NotFound desc = could not find container \"f7f2e4d94e667e68024ab65bc2278d1a8524c92d035e164ff88231e94d99d863\": container with ID starting with f7f2e4d94e667e68024ab65bc2278d1a8524c92d035e164ff88231e94d99d863 not found: ID does not exist"
Mar 20 11:29:30 crc kubenswrapper[4695]: I0320 11:29:30.574709 4695 scope.go:117] "RemoveContainer" containerID="419ca6268b2bfdd6beabeeda5dbc0527c2f1e06a14e2842072684aa9964adaf6"
Mar 20 11:29:30 crc kubenswrapper[4695]: E0320 11:29:30.575188 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"419ca6268b2bfdd6beabeeda5dbc0527c2f1e06a14e2842072684aa9964adaf6\": container with ID starting with 419ca6268b2bfdd6beabeeda5dbc0527c2f1e06a14e2842072684aa9964adaf6 not found: ID does not exist" containerID="419ca6268b2bfdd6beabeeda5dbc0527c2f1e06a14e2842072684aa9964adaf6"
Mar 20 11:29:30 crc kubenswrapper[4695]: I0320 11:29:30.575249 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"419ca6268b2bfdd6beabeeda5dbc0527c2f1e06a14e2842072684aa9964adaf6"} err="failed to get container status \"419ca6268b2bfdd6beabeeda5dbc0527c2f1e06a14e2842072684aa9964adaf6\": rpc error: code = NotFound desc = could not find container \"419ca6268b2bfdd6beabeeda5dbc0527c2f1e06a14e2842072684aa9964adaf6\": container with ID starting with 419ca6268b2bfdd6beabeeda5dbc0527c2f1e06a14e2842072684aa9964adaf6 not found: ID does not exist"
Mar 20 11:29:30 crc kubenswrapper[4695]: I0320 11:29:30.575286 4695 scope.go:117] "RemoveContainer" containerID="6e35c6fe15ec171c2391e76e6abf19c962af8350238cde256e2aa28ec00a1f69"
Mar 20 11:29:30 crc kubenswrapper[4695]: E0320 11:29:30.575605 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e35c6fe15ec171c2391e76e6abf19c962af8350238cde256e2aa28ec00a1f69\": container with ID starting with 6e35c6fe15ec171c2391e76e6abf19c962af8350238cde256e2aa28ec00a1f69 not found: ID does not exist" containerID="6e35c6fe15ec171c2391e76e6abf19c962af8350238cde256e2aa28ec00a1f69"
Mar 20 11:29:30 crc kubenswrapper[4695]: I0320 11:29:30.575643 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e35c6fe15ec171c2391e76e6abf19c962af8350238cde256e2aa28ec00a1f69"} err="failed to get container status \"6e35c6fe15ec171c2391e76e6abf19c962af8350238cde256e2aa28ec00a1f69\": rpc error: code = NotFound desc = could not find container \"6e35c6fe15ec171c2391e76e6abf19c962af8350238cde256e2aa28ec00a1f69\": container with ID starting with 6e35c6fe15ec171c2391e76e6abf19c962af8350238cde256e2aa28ec00a1f69 not found: ID does not exist"
Mar 20 11:29:30 crc kubenswrapper[4695]: I0320 11:29:30.895055 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a" path="/var/lib/kubelet/pods/f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a/volumes"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.147039 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566770-vmtxw"]
Mar 20 11:30:00 crc kubenswrapper[4695]: E0320 11:30:00.148380 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a" containerName="extract-content"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.148397 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a" containerName="extract-content"
Mar 20 11:30:00 crc kubenswrapper[4695]: E0320 11:30:00.148416 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41785d21-37b6-4226-a059-0fc062e4cfc2" containerName="extract-utilities"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.148423 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="41785d21-37b6-4226-a059-0fc062e4cfc2" containerName="extract-utilities"
Mar 20 11:30:00 crc kubenswrapper[4695]: E0320 11:30:00.148438 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41785d21-37b6-4226-a059-0fc062e4cfc2" containerName="registry-server"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.148445 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="41785d21-37b6-4226-a059-0fc062e4cfc2" containerName="registry-server"
Mar 20 11:30:00 crc kubenswrapper[4695]: E0320 11:30:00.148454 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41785d21-37b6-4226-a059-0fc062e4cfc2" containerName="extract-content"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.148460 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="41785d21-37b6-4226-a059-0fc062e4cfc2" containerName="extract-content"
Mar 20 11:30:00 crc kubenswrapper[4695]: E0320 11:30:00.148474 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a" containerName="registry-server"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.148480 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a" containerName="registry-server"
Mar 20 11:30:00 crc kubenswrapper[4695]: E0320 11:30:00.148496 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a" containerName="extract-utilities"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.148504 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a" containerName="extract-utilities"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.148662 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="f65ccb1a-a583-4c48-9efe-1c9ecbdcf38a" containerName="registry-server"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.148683 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="41785d21-37b6-4226-a059-0fc062e4cfc2" containerName="registry-server"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.149240 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566770-vmtxw"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.153851 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5kqds"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.154434 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.154692 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.161158 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566770-vmtxw"]
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.167745 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566770-ld54d"]
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.169050 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-ld54d"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.174973 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.175288 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.175458 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566770-ld54d"]
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.256058 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9d6m\" (UniqueName: \"kubernetes.io/projected/b7167149-059f-4428-b098-132c908e398d-kube-api-access-v9d6m\") pod \"collect-profiles-29566770-ld54d\" (UID: \"b7167149-059f-4428-b098-132c908e398d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-ld54d"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.256154 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26dtx\" (UniqueName: \"kubernetes.io/projected/56d23936-2917-4468-b9a2-ab9eb4cea0cf-kube-api-access-26dtx\") pod \"auto-csr-approver-29566770-vmtxw\" (UID: \"56d23936-2917-4468-b9a2-ab9eb4cea0cf\") " pod="openshift-infra/auto-csr-approver-29566770-vmtxw"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.256215 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b7167149-059f-4428-b098-132c908e398d-config-volume\") pod \"collect-profiles-29566770-ld54d\" (UID: \"b7167149-059f-4428-b098-132c908e398d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-ld54d"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.256241 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b7167149-059f-4428-b098-132c908e398d-secret-volume\") pod \"collect-profiles-29566770-ld54d\" (UID: \"b7167149-059f-4428-b098-132c908e398d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-ld54d"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.358112 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26dtx\" (UniqueName: \"kubernetes.io/projected/56d23936-2917-4468-b9a2-ab9eb4cea0cf-kube-api-access-26dtx\") pod \"auto-csr-approver-29566770-vmtxw\" (UID: \"56d23936-2917-4468-b9a2-ab9eb4cea0cf\") " pod="openshift-infra/auto-csr-approver-29566770-vmtxw"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.358196 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b7167149-059f-4428-b098-132c908e398d-config-volume\") pod \"collect-profiles-29566770-ld54d\" (UID: \"b7167149-059f-4428-b098-132c908e398d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-ld54d"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.358217 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b7167149-059f-4428-b098-132c908e398d-secret-volume\") pod \"collect-profiles-29566770-ld54d\" (UID: \"b7167149-059f-4428-b098-132c908e398d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-ld54d"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.358263 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9d6m\" (UniqueName: \"kubernetes.io/projected/b7167149-059f-4428-b098-132c908e398d-kube-api-access-v9d6m\") pod \"collect-profiles-29566770-ld54d\" (UID: \"b7167149-059f-4428-b098-132c908e398d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-ld54d"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.359307 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b7167149-059f-4428-b098-132c908e398d-config-volume\") pod \"collect-profiles-29566770-ld54d\" (UID: \"b7167149-059f-4428-b098-132c908e398d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-ld54d"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.369300 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b7167149-059f-4428-b098-132c908e398d-secret-volume\") pod \"collect-profiles-29566770-ld54d\" (UID: \"b7167149-059f-4428-b098-132c908e398d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-ld54d"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.375332 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26dtx\" (UniqueName: \"kubernetes.io/projected/56d23936-2917-4468-b9a2-ab9eb4cea0cf-kube-api-access-26dtx\") pod \"auto-csr-approver-29566770-vmtxw\" (UID: \"56d23936-2917-4468-b9a2-ab9eb4cea0cf\") " pod="openshift-infra/auto-csr-approver-29566770-vmtxw"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.376136 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9d6m\" (UniqueName: \"kubernetes.io/projected/b7167149-059f-4428-b098-132c908e398d-kube-api-access-v9d6m\") pod \"collect-profiles-29566770-ld54d\" (UID: \"b7167149-059f-4428-b098-132c908e398d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-ld54d"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.487933 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566770-vmtxw"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.498122 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-ld54d"
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.936456 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566770-ld54d"]
Mar 20 11:30:00 crc kubenswrapper[4695]: W0320 11:30:00.988433 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod56d23936_2917_4468_b9a2_ab9eb4cea0cf.slice/crio-affe76246f9e62e198cf20cac82181620a4dc75b4a333c69b4253b8b9bd5b100 WatchSource:0}: Error finding container affe76246f9e62e198cf20cac82181620a4dc75b4a333c69b4253b8b9bd5b100: Status 404 returned error can't find the container with id affe76246f9e62e198cf20cac82181620a4dc75b4a333c69b4253b8b9bd5b100
Mar 20 11:30:00 crc kubenswrapper[4695]: I0320 11:30:00.988626 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566770-vmtxw"]
Mar 20 11:30:01 crc kubenswrapper[4695]: I0320 11:30:01.245268 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-ld54d" event={"ID":"b7167149-059f-4428-b098-132c908e398d","Type":"ContainerStarted","Data":"d285270a98aa88577ec2f29a4dbb5c0783e0c3540cd8e77f2f0dbd5aeba5b4a1"}
Mar 20 11:30:01 crc kubenswrapper[4695]: I0320 11:30:01.245350 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-ld54d" event={"ID":"b7167149-059f-4428-b098-132c908e398d","Type":"ContainerStarted","Data":"83e66bf45d540b71d3bd144818fa531150ba3819c9aced34cd4c3a8c9c83b5e3"}
Mar 20 11:30:01 crc kubenswrapper[4695]: I0320 11:30:01.247107 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566770-vmtxw" event={"ID":"56d23936-2917-4468-b9a2-ab9eb4cea0cf","Type":"ContainerStarted","Data":"affe76246f9e62e198cf20cac82181620a4dc75b4a333c69b4253b8b9bd5b100"}
Mar 20 11:30:01 crc kubenswrapper[4695]: I0320 11:30:01.269514 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-ld54d" podStartSLOduration=1.269496074 podStartE2EDuration="1.269496074s" podCreationTimestamp="2026-03-20 11:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:30:01.26426928 +0000 UTC m=+2179.044874843" watchObservedRunningTime="2026-03-20 11:30:01.269496074 +0000 UTC m=+2179.050101637"
Mar 20 11:30:02 crc kubenswrapper[4695]: I0320 11:30:02.263762 4695 generic.go:334] "Generic (PLEG): container finished" podID="b7167149-059f-4428-b098-132c908e398d" containerID="d285270a98aa88577ec2f29a4dbb5c0783e0c3540cd8e77f2f0dbd5aeba5b4a1" exitCode=0
Mar 20 11:30:02 crc kubenswrapper[4695]: I0320 11:30:02.263839 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-ld54d" event={"ID":"b7167149-059f-4428-b098-132c908e398d","Type":"ContainerDied","Data":"d285270a98aa88577ec2f29a4dbb5c0783e0c3540cd8e77f2f0dbd5aeba5b4a1"}
Mar 20 11:30:03 crc kubenswrapper[4695]: I0320 11:30:03.277173 4695 generic.go:334] "Generic (PLEG): container finished" podID="56d23936-2917-4468-b9a2-ab9eb4cea0cf" containerID="3b2bf591e1cec82c24bf3f85332f2a58de4652db80000addab9d7258253dc6d3" exitCode=0
Mar 20 11:30:03 crc kubenswrapper[4695]: I0320 11:30:03.277429 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566770-vmtxw" event={"ID":"56d23936-2917-4468-b9a2-ab9eb4cea0cf","Type":"ContainerDied","Data":"3b2bf591e1cec82c24bf3f85332f2a58de4652db80000addab9d7258253dc6d3"}
Mar 20 11:30:03 crc kubenswrapper[4695]: I0320 11:30:03.568988 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-ld54d"
Mar 20 11:30:03 crc kubenswrapper[4695]: I0320 11:30:03.611556 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9d6m\" (UniqueName: \"kubernetes.io/projected/b7167149-059f-4428-b098-132c908e398d-kube-api-access-v9d6m\") pod \"b7167149-059f-4428-b098-132c908e398d\" (UID: \"b7167149-059f-4428-b098-132c908e398d\") "
Mar 20 11:30:03 crc kubenswrapper[4695]: I0320 11:30:03.611617 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b7167149-059f-4428-b098-132c908e398d-config-volume\") pod \"b7167149-059f-4428-b098-132c908e398d\" (UID: \"b7167149-059f-4428-b098-132c908e398d\") "
Mar 20 11:30:03 crc kubenswrapper[4695]: I0320 11:30:03.611654 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b7167149-059f-4428-b098-132c908e398d-secret-volume\") pod \"b7167149-059f-4428-b098-132c908e398d\" (UID: \"b7167149-059f-4428-b098-132c908e398d\") "
Mar 20 11:30:03 crc kubenswrapper[4695]: I0320 11:30:03.612847 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7167149-059f-4428-b098-132c908e398d-config-volume" (OuterVolumeSpecName: "config-volume") pod "b7167149-059f-4428-b098-132c908e398d" (UID: "b7167149-059f-4428-b098-132c908e398d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 11:30:03 crc kubenswrapper[4695]: I0320 11:30:03.618321 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7167149-059f-4428-b098-132c908e398d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b7167149-059f-4428-b098-132c908e398d" (UID: "b7167149-059f-4428-b098-132c908e398d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 11:30:03 crc kubenswrapper[4695]: I0320 11:30:03.618538 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7167149-059f-4428-b098-132c908e398d-kube-api-access-v9d6m" (OuterVolumeSpecName: "kube-api-access-v9d6m") pod "b7167149-059f-4428-b098-132c908e398d" (UID: "b7167149-059f-4428-b098-132c908e398d"). InnerVolumeSpecName "kube-api-access-v9d6m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:30:03 crc kubenswrapper[4695]: I0320 11:30:03.713360 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9d6m\" (UniqueName: \"kubernetes.io/projected/b7167149-059f-4428-b098-132c908e398d-kube-api-access-v9d6m\") on node \"crc\" DevicePath \"\""
Mar 20 11:30:03 crc kubenswrapper[4695]: I0320 11:30:03.713405 4695 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b7167149-059f-4428-b098-132c908e398d-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 11:30:03 crc kubenswrapper[4695]: I0320 11:30:03.713415 4695 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b7167149-059f-4428-b098-132c908e398d-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 20 11:30:04 crc kubenswrapper[4695]: I0320 11:30:04.294801 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-ld54d"
Mar 20 11:30:04 crc kubenswrapper[4695]: I0320 11:30:04.298494 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-ld54d" event={"ID":"b7167149-059f-4428-b098-132c908e398d","Type":"ContainerDied","Data":"83e66bf45d540b71d3bd144818fa531150ba3819c9aced34cd4c3a8c9c83b5e3"}
Mar 20 11:30:04 crc kubenswrapper[4695]: I0320 11:30:04.298692 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83e66bf45d540b71d3bd144818fa531150ba3819c9aced34cd4c3a8c9c83b5e3"
Mar 20 11:30:04 crc kubenswrapper[4695]: I0320 11:30:04.350875 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566725-qt6v2"]
Mar 20 11:30:04 crc kubenswrapper[4695]: I0320 11:30:04.359143 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566725-qt6v2"]
Mar 20 11:30:04 crc kubenswrapper[4695]: I0320 11:30:04.576678 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566770-vmtxw"
Mar 20 11:30:04 crc kubenswrapper[4695]: I0320 11:30:04.627826 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26dtx\" (UniqueName: \"kubernetes.io/projected/56d23936-2917-4468-b9a2-ab9eb4cea0cf-kube-api-access-26dtx\") pod \"56d23936-2917-4468-b9a2-ab9eb4cea0cf\" (UID: \"56d23936-2917-4468-b9a2-ab9eb4cea0cf\") "
Mar 20 11:30:04 crc kubenswrapper[4695]: I0320 11:30:04.633587 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56d23936-2917-4468-b9a2-ab9eb4cea0cf-kube-api-access-26dtx" (OuterVolumeSpecName: "kube-api-access-26dtx") pod "56d23936-2917-4468-b9a2-ab9eb4cea0cf" (UID: "56d23936-2917-4468-b9a2-ab9eb4cea0cf"). InnerVolumeSpecName "kube-api-access-26dtx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:30:04 crc kubenswrapper[4695]: I0320 11:30:04.729814 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26dtx\" (UniqueName: \"kubernetes.io/projected/56d23936-2917-4468-b9a2-ab9eb4cea0cf-kube-api-access-26dtx\") on node \"crc\" DevicePath \"\""
Mar 20 11:30:04 crc kubenswrapper[4695]: I0320 11:30:04.895328 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eafc357-18f4-49a8-88be-d7e67ed800a0" path="/var/lib/kubelet/pods/2eafc357-18f4-49a8-88be-d7e67ed800a0/volumes"
Mar 20 11:30:05 crc kubenswrapper[4695]: I0320 11:30:05.306467 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566770-vmtxw" event={"ID":"56d23936-2917-4468-b9a2-ab9eb4cea0cf","Type":"ContainerDied","Data":"affe76246f9e62e198cf20cac82181620a4dc75b4a333c69b4253b8b9bd5b100"}
Mar 20 11:30:05 crc kubenswrapper[4695]: I0320 11:30:05.308475 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="affe76246f9e62e198cf20cac82181620a4dc75b4a333c69b4253b8b9bd5b100"
Mar 20 11:30:05 crc kubenswrapper[4695]: I0320 11:30:05.306523 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566770-vmtxw"
Mar 20 11:30:05 crc kubenswrapper[4695]: I0320 11:30:05.632356 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566764-gwbpr"]
Mar 20 11:30:05 crc kubenswrapper[4695]: I0320 11:30:05.640546 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566764-gwbpr"]
Mar 20 11:30:06 crc kubenswrapper[4695]: I0320 11:30:06.514874 4695 scope.go:117] "RemoveContainer" containerID="91585b1f21b2154949883fb06e2da716982e430c256e2e5f55cf591cbbf69ba1"
Mar 20 11:30:06 crc kubenswrapper[4695]: I0320 11:30:06.898192 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d30de059-4384-4d9b-bffe-7f0133eb3c2f" path="/var/lib/kubelet/pods/d30de059-4384-4d9b-bffe-7f0133eb3c2f/volumes"
Mar 20 11:31:06 crc kubenswrapper[4695]: I0320 11:31:06.592549 4695 scope.go:117] "RemoveContainer" containerID="a970e576a05b9bd8dd9d5e1ad2049639edcd0b88c8962334a9b07092c5ba3d03"
Mar 20 11:31:08 crc kubenswrapper[4695]: I0320 11:31:08.431251 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 11:31:08 crc kubenswrapper[4695]: I0320 11:31:08.431639 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 11:31:23 crc kubenswrapper[4695]: I0320 11:31:23.591248 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5rbh8"]
Mar 20 11:31:23 crc kubenswrapper[4695]: E0320 11:31:23.592395 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56d23936-2917-4468-b9a2-ab9eb4cea0cf" containerName="oc"
Mar 20 11:31:23 crc kubenswrapper[4695]: I0320 11:31:23.592504 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="56d23936-2917-4468-b9a2-ab9eb4cea0cf" containerName="oc"
Mar 20 11:31:23 crc kubenswrapper[4695]: E0320 11:31:23.592555 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7167149-059f-4428-b098-132c908e398d" containerName="collect-profiles"
Mar 20 11:31:23 crc kubenswrapper[4695]: I0320 11:31:23.592569 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7167149-059f-4428-b098-132c908e398d" containerName="collect-profiles"
Mar 20 11:31:23 crc kubenswrapper[4695]: I0320 11:31:23.592804 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7167149-059f-4428-b098-132c908e398d" containerName="collect-profiles"
Mar 20 11:31:23 crc kubenswrapper[4695]: I0320 11:31:23.592821 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="56d23936-2917-4468-b9a2-ab9eb4cea0cf" containerName="oc"
Mar 20 11:31:23 crc kubenswrapper[4695]: I0320 11:31:23.594166 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5rbh8"
Mar 20 11:31:23 crc kubenswrapper[4695]: I0320 11:31:23.615813 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5rbh8"]
Mar 20 11:31:23 crc kubenswrapper[4695]: I0320 11:31:23.859111 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b121109-931b-4d2d-ad39-0f7c3bb5180b-utilities\") pod \"community-operators-5rbh8\" (UID: \"5b121109-931b-4d2d-ad39-0f7c3bb5180b\") " pod="openshift-marketplace/community-operators-5rbh8"
Mar 20 11:31:23 crc kubenswrapper[4695]: I0320 11:31:23.859172 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b121109-931b-4d2d-ad39-0f7c3bb5180b-catalog-content\") pod \"community-operators-5rbh8\" (UID: \"5b121109-931b-4d2d-ad39-0f7c3bb5180b\") " pod="openshift-marketplace/community-operators-5rbh8"
Mar 20 11:31:23 crc kubenswrapper[4695]: I0320 11:31:23.859212 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6ck8\" (UniqueName: \"kubernetes.io/projected/5b121109-931b-4d2d-ad39-0f7c3bb5180b-kube-api-access-k6ck8\") pod \"community-operators-5rbh8\" (UID: \"5b121109-931b-4d2d-ad39-0f7c3bb5180b\") " pod="openshift-marketplace/community-operators-5rbh8"
Mar 20 11:31:23 crc kubenswrapper[4695]: I0320 11:31:23.963815 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b121109-931b-4d2d-ad39-0f7c3bb5180b-utilities\") pod \"community-operators-5rbh8\" (UID: \"5b121109-931b-4d2d-ad39-0f7c3bb5180b\") " pod="openshift-marketplace/community-operators-5rbh8"
Mar 20 11:31:23 crc kubenswrapper[4695]: I0320 11:31:23.963901 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b121109-931b-4d2d-ad39-0f7c3bb5180b-catalog-content\") pod \"community-operators-5rbh8\" (UID: \"5b121109-931b-4d2d-ad39-0f7c3bb5180b\") " pod="openshift-marketplace/community-operators-5rbh8"
Mar 20 11:31:23 crc kubenswrapper[4695]: I0320 11:31:23.964488 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b121109-931b-4d2d-ad39-0f7c3bb5180b-catalog-content\") pod \"community-operators-5rbh8\" (UID: \"5b121109-931b-4d2d-ad39-0f7c3bb5180b\") " pod="openshift-marketplace/community-operators-5rbh8"
Mar 20 11:31:23 crc kubenswrapper[4695]: I0320 11:31:23.964578 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6ck8\" (UniqueName: \"kubernetes.io/projected/5b121109-931b-4d2d-ad39-0f7c3bb5180b-kube-api-access-k6ck8\") pod \"community-operators-5rbh8\" (UID: \"5b121109-931b-4d2d-ad39-0f7c3bb5180b\") " pod="openshift-marketplace/community-operators-5rbh8"
Mar 20 11:31:23 crc kubenswrapper[4695]: I0320 11:31:23.965749 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b121109-931b-4d2d-ad39-0f7c3bb5180b-utilities\") pod \"community-operators-5rbh8\" (UID: \"5b121109-931b-4d2d-ad39-0f7c3bb5180b\") " pod="openshift-marketplace/community-operators-5rbh8"
Mar 20 11:31:24 crc kubenswrapper[4695]: I0320 11:31:24.012942 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6ck8\" (UniqueName: \"kubernetes.io/projected/5b121109-931b-4d2d-ad39-0f7c3bb5180b-kube-api-access-k6ck8\") pod \"community-operators-5rbh8\" (UID: \"5b121109-931b-4d2d-ad39-0f7c3bb5180b\") " pod="openshift-marketplace/community-operators-5rbh8"
Mar 20 11:31:24 crc kubenswrapper[4695]: I0320 11:31:24.220242 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5rbh8"
Mar 20 11:31:25 crc kubenswrapper[4695]: I0320 11:31:25.010183 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5rbh8"]
Mar 20 11:31:26 crc kubenswrapper[4695]: I0320 11:31:26.310295 4695 generic.go:334] "Generic (PLEG): container finished" podID="5b121109-931b-4d2d-ad39-0f7c3bb5180b" containerID="c14fce03576118d8944e9cfb3f72c8e8f633cd1757454adaf668e2321907e190" exitCode=0
Mar 20 11:31:26 crc kubenswrapper[4695]: I0320 11:31:26.312612 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5rbh8" event={"ID":"5b121109-931b-4d2d-ad39-0f7c3bb5180b","Type":"ContainerDied","Data":"c14fce03576118d8944e9cfb3f72c8e8f633cd1757454adaf668e2321907e190"}
Mar 20 11:31:26 crc kubenswrapper[4695]: I0320 11:31:26.318750 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5rbh8" event={"ID":"5b121109-931b-4d2d-ad39-0f7c3bb5180b","Type":"ContainerStarted","Data":"31c8f18e265218c1eaa94932255940da8e94edc4178a0437444c3e15fcb94612"}
Mar 20 11:31:26 crc kubenswrapper[4695]: I0320 11:31:26.319234 4695 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 11:31:28 crc kubenswrapper[4695]: I0320 11:31:28.394239 4695 generic.go:334] "Generic (PLEG): container finished" podID="5b121109-931b-4d2d-ad39-0f7c3bb5180b" containerID="b4a04a40c96136308fc6ee5edcc8190c50d173eb7eb0078c1293d8a03110a17c" exitCode=0
Mar 20 11:31:28 crc kubenswrapper[4695]: I0320 11:31:28.394560 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5rbh8" event={"ID":"5b121109-931b-4d2d-ad39-0f7c3bb5180b","Type":"ContainerDied","Data":"b4a04a40c96136308fc6ee5edcc8190c50d173eb7eb0078c1293d8a03110a17c"}
Mar 20 11:31:30 crc kubenswrapper[4695]: I0320 11:31:30.419268 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5rbh8" event={"ID":"5b121109-931b-4d2d-ad39-0f7c3bb5180b","Type":"ContainerStarted","Data":"cb1333c2f229c89d275daa6bb22d08c9ef3a1346875274300272f7f38d377516"}
Mar 20 11:31:30 crc kubenswrapper[4695]: I0320 11:31:30.443800 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5rbh8" podStartSLOduration=3.882272087 podStartE2EDuration="7.443775437s" podCreationTimestamp="2026-03-20 11:31:23 +0000 UTC" firstStartedPulling="2026-03-20 11:31:26.318720913 +0000 UTC m=+2264.099326486" lastFinishedPulling="2026-03-20 11:31:29.880224273 +0000 UTC m=+2267.660829836" observedRunningTime="2026-03-20 11:31:30.436277146 +0000 UTC m=+2268.216882739" watchObservedRunningTime="2026-03-20 11:31:30.443775437 +0000 UTC m=+2268.224381000"
Mar 20 11:31:34 crc kubenswrapper[4695]: I0320 11:31:34.221074 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5rbh8"
Mar 20 11:31:34 crc kubenswrapper[4695]: I0320 11:31:34.221445 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5rbh8"
Mar 20 11:31:34 crc kubenswrapper[4695]: I0320 11:31:34.311516 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5rbh8"
Mar 20 11:31:38 crc kubenswrapper[4695]: I0320 11:31:38.430552 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 11:31:38 crc kubenswrapper[4695]: I0320 11:31:38.431034 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 11:31:44 crc kubenswrapper[4695]: I0320 11:31:44.295688 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5rbh8"
Mar 20 11:31:44 crc kubenswrapper[4695]: I0320 11:31:44.347342 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5rbh8"]
Mar 20 11:31:44 crc kubenswrapper[4695]: I0320 11:31:44.722960 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5rbh8" podUID="5b121109-931b-4d2d-ad39-0f7c3bb5180b" containerName="registry-server" containerID="cri-o://cb1333c2f229c89d275daa6bb22d08c9ef3a1346875274300272f7f38d377516" gracePeriod=2
Mar 20 11:31:45 crc kubenswrapper[4695]: I0320 11:31:45.130463 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5rbh8"
Mar 20 11:31:45 crc kubenswrapper[4695]: I0320 11:31:45.973091 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6ck8\" (UniqueName: \"kubernetes.io/projected/5b121109-931b-4d2d-ad39-0f7c3bb5180b-kube-api-access-k6ck8\") pod \"5b121109-931b-4d2d-ad39-0f7c3bb5180b\" (UID: \"5b121109-931b-4d2d-ad39-0f7c3bb5180b\") "
Mar 20 11:31:45 crc kubenswrapper[4695]: I0320 11:31:45.973164 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b121109-931b-4d2d-ad39-0f7c3bb5180b-utilities\") pod \"5b121109-931b-4d2d-ad39-0f7c3bb5180b\" (UID: \"5b121109-931b-4d2d-ad39-0f7c3bb5180b\") "
Mar 20 11:31:45 crc kubenswrapper[4695]: I0320 11:31:45.973291 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b121109-931b-4d2d-ad39-0f7c3bb5180b-catalog-content\") pod \"5b121109-931b-4d2d-ad39-0f7c3bb5180b\" (UID: \"5b121109-931b-4d2d-ad39-0f7c3bb5180b\") "
Mar 20 11:31:45 crc kubenswrapper[4695]: I0320 11:31:45.974317 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b121109-931b-4d2d-ad39-0f7c3bb5180b-utilities" (OuterVolumeSpecName: "utilities") pod "5b121109-931b-4d2d-ad39-0f7c3bb5180b" (UID: "5b121109-931b-4d2d-ad39-0f7c3bb5180b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:31:45 crc kubenswrapper[4695]: I0320 11:31:45.997404 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b121109-931b-4d2d-ad39-0f7c3bb5180b-kube-api-access-k6ck8" (OuterVolumeSpecName: "kube-api-access-k6ck8") pod "5b121109-931b-4d2d-ad39-0f7c3bb5180b" (UID: "5b121109-931b-4d2d-ad39-0f7c3bb5180b"). InnerVolumeSpecName "kube-api-access-k6ck8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:31:46 crc kubenswrapper[4695]: I0320 11:31:46.011106 4695 generic.go:334] "Generic (PLEG): container finished" podID="5b121109-931b-4d2d-ad39-0f7c3bb5180b" containerID="cb1333c2f229c89d275daa6bb22d08c9ef3a1346875274300272f7f38d377516" exitCode=0
Mar 20 11:31:46 crc kubenswrapper[4695]: I0320 11:31:46.011159 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5rbh8" event={"ID":"5b121109-931b-4d2d-ad39-0f7c3bb5180b","Type":"ContainerDied","Data":"cb1333c2f229c89d275daa6bb22d08c9ef3a1346875274300272f7f38d377516"}
Mar 20 11:31:46 crc kubenswrapper[4695]: I0320 11:31:46.011192 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5rbh8" event={"ID":"5b121109-931b-4d2d-ad39-0f7c3bb5180b","Type":"ContainerDied","Data":"31c8f18e265218c1eaa94932255940da8e94edc4178a0437444c3e15fcb94612"}
Mar 20 11:31:46 crc kubenswrapper[4695]: I0320 11:31:46.011215 4695 scope.go:117] "RemoveContainer" containerID="cb1333c2f229c89d275daa6bb22d08c9ef3a1346875274300272f7f38d377516"
Mar 20 11:31:46 crc kubenswrapper[4695]: I0320 11:31:46.011364 4695 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-5rbh8" Mar 20 11:31:46 crc kubenswrapper[4695]: I0320 11:31:46.034922 4695 scope.go:117] "RemoveContainer" containerID="b4a04a40c96136308fc6ee5edcc8190c50d173eb7eb0078c1293d8a03110a17c" Mar 20 11:31:46 crc kubenswrapper[4695]: I0320 11:31:46.058141 4695 scope.go:117] "RemoveContainer" containerID="c14fce03576118d8944e9cfb3f72c8e8f633cd1757454adaf668e2321907e190" Mar 20 11:31:46 crc kubenswrapper[4695]: I0320 11:31:46.059833 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b121109-931b-4d2d-ad39-0f7c3bb5180b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5b121109-931b-4d2d-ad39-0f7c3bb5180b" (UID: "5b121109-931b-4d2d-ad39-0f7c3bb5180b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:31:46 crc kubenswrapper[4695]: I0320 11:31:46.085630 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6ck8\" (UniqueName: \"kubernetes.io/projected/5b121109-931b-4d2d-ad39-0f7c3bb5180b-kube-api-access-k6ck8\") on node \"crc\" DevicePath \"\"" Mar 20 11:31:46 crc kubenswrapper[4695]: I0320 11:31:46.085666 4695 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5b121109-931b-4d2d-ad39-0f7c3bb5180b-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:31:46 crc kubenswrapper[4695]: I0320 11:31:46.085679 4695 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5b121109-931b-4d2d-ad39-0f7c3bb5180b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:31:46 crc kubenswrapper[4695]: I0320 11:31:46.100354 4695 scope.go:117] "RemoveContainer" containerID="cb1333c2f229c89d275daa6bb22d08c9ef3a1346875274300272f7f38d377516" Mar 20 11:31:46 crc kubenswrapper[4695]: E0320 11:31:46.101078 4695 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"cb1333c2f229c89d275daa6bb22d08c9ef3a1346875274300272f7f38d377516\": container with ID starting with cb1333c2f229c89d275daa6bb22d08c9ef3a1346875274300272f7f38d377516 not found: ID does not exist" containerID="cb1333c2f229c89d275daa6bb22d08c9ef3a1346875274300272f7f38d377516" Mar 20 11:31:46 crc kubenswrapper[4695]: I0320 11:31:46.101166 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb1333c2f229c89d275daa6bb22d08c9ef3a1346875274300272f7f38d377516"} err="failed to get container status \"cb1333c2f229c89d275daa6bb22d08c9ef3a1346875274300272f7f38d377516\": rpc error: code = NotFound desc = could not find container \"cb1333c2f229c89d275daa6bb22d08c9ef3a1346875274300272f7f38d377516\": container with ID starting with cb1333c2f229c89d275daa6bb22d08c9ef3a1346875274300272f7f38d377516 not found: ID does not exist" Mar 20 11:31:46 crc kubenswrapper[4695]: I0320 11:31:46.101210 4695 scope.go:117] "RemoveContainer" containerID="b4a04a40c96136308fc6ee5edcc8190c50d173eb7eb0078c1293d8a03110a17c" Mar 20 11:31:46 crc kubenswrapper[4695]: E0320 11:31:46.101829 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4a04a40c96136308fc6ee5edcc8190c50d173eb7eb0078c1293d8a03110a17c\": container with ID starting with b4a04a40c96136308fc6ee5edcc8190c50d173eb7eb0078c1293d8a03110a17c not found: ID does not exist" containerID="b4a04a40c96136308fc6ee5edcc8190c50d173eb7eb0078c1293d8a03110a17c" Mar 20 11:31:46 crc kubenswrapper[4695]: I0320 11:31:46.101892 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4a04a40c96136308fc6ee5edcc8190c50d173eb7eb0078c1293d8a03110a17c"} err="failed to get container status \"b4a04a40c96136308fc6ee5edcc8190c50d173eb7eb0078c1293d8a03110a17c\": rpc error: code = NotFound desc = could not find container 
\"b4a04a40c96136308fc6ee5edcc8190c50d173eb7eb0078c1293d8a03110a17c\": container with ID starting with b4a04a40c96136308fc6ee5edcc8190c50d173eb7eb0078c1293d8a03110a17c not found: ID does not exist" Mar 20 11:31:46 crc kubenswrapper[4695]: I0320 11:31:46.101947 4695 scope.go:117] "RemoveContainer" containerID="c14fce03576118d8944e9cfb3f72c8e8f633cd1757454adaf668e2321907e190" Mar 20 11:31:46 crc kubenswrapper[4695]: E0320 11:31:46.102382 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c14fce03576118d8944e9cfb3f72c8e8f633cd1757454adaf668e2321907e190\": container with ID starting with c14fce03576118d8944e9cfb3f72c8e8f633cd1757454adaf668e2321907e190 not found: ID does not exist" containerID="c14fce03576118d8944e9cfb3f72c8e8f633cd1757454adaf668e2321907e190" Mar 20 11:31:46 crc kubenswrapper[4695]: I0320 11:31:46.102437 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c14fce03576118d8944e9cfb3f72c8e8f633cd1757454adaf668e2321907e190"} err="failed to get container status \"c14fce03576118d8944e9cfb3f72c8e8f633cd1757454adaf668e2321907e190\": rpc error: code = NotFound desc = could not find container \"c14fce03576118d8944e9cfb3f72c8e8f633cd1757454adaf668e2321907e190\": container with ID starting with c14fce03576118d8944e9cfb3f72c8e8f633cd1757454adaf668e2321907e190 not found: ID does not exist" Mar 20 11:31:46 crc kubenswrapper[4695]: I0320 11:31:46.345984 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5rbh8"] Mar 20 11:31:46 crc kubenswrapper[4695]: I0320 11:31:46.356178 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5rbh8"] Mar 20 11:31:46 crc kubenswrapper[4695]: I0320 11:31:46.897591 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b121109-931b-4d2d-ad39-0f7c3bb5180b" 
path="/var/lib/kubelet/pods/5b121109-931b-4d2d-ad39-0f7c3bb5180b/volumes" Mar 20 11:32:00 crc kubenswrapper[4695]: I0320 11:32:00.143698 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566772-qhj7r"] Mar 20 11:32:00 crc kubenswrapper[4695]: E0320 11:32:00.144865 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b121109-931b-4d2d-ad39-0f7c3bb5180b" containerName="registry-server" Mar 20 11:32:00 crc kubenswrapper[4695]: I0320 11:32:00.144883 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b121109-931b-4d2d-ad39-0f7c3bb5180b" containerName="registry-server" Mar 20 11:32:00 crc kubenswrapper[4695]: E0320 11:32:00.144928 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b121109-931b-4d2d-ad39-0f7c3bb5180b" containerName="extract-utilities" Mar 20 11:32:00 crc kubenswrapper[4695]: I0320 11:32:00.144936 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b121109-931b-4d2d-ad39-0f7c3bb5180b" containerName="extract-utilities" Mar 20 11:32:00 crc kubenswrapper[4695]: E0320 11:32:00.144949 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b121109-931b-4d2d-ad39-0f7c3bb5180b" containerName="extract-content" Mar 20 11:32:00 crc kubenswrapper[4695]: I0320 11:32:00.144959 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b121109-931b-4d2d-ad39-0f7c3bb5180b" containerName="extract-content" Mar 20 11:32:00 crc kubenswrapper[4695]: I0320 11:32:00.145125 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b121109-931b-4d2d-ad39-0f7c3bb5180b" containerName="registry-server" Mar 20 11:32:00 crc kubenswrapper[4695]: I0320 11:32:00.145743 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566772-qhj7r" Mar 20 11:32:00 crc kubenswrapper[4695]: I0320 11:32:00.149210 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:32:00 crc kubenswrapper[4695]: I0320 11:32:00.149666 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5kqds" Mar 20 11:32:00 crc kubenswrapper[4695]: I0320 11:32:00.150627 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:32:00 crc kubenswrapper[4695]: I0320 11:32:00.159844 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566772-qhj7r"] Mar 20 11:32:00 crc kubenswrapper[4695]: I0320 11:32:00.304730 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhpr9\" (UniqueName: \"kubernetes.io/projected/4110e916-7822-40e8-951c-1b6a864a0d2f-kube-api-access-vhpr9\") pod \"auto-csr-approver-29566772-qhj7r\" (UID: \"4110e916-7822-40e8-951c-1b6a864a0d2f\") " pod="openshift-infra/auto-csr-approver-29566772-qhj7r" Mar 20 11:32:00 crc kubenswrapper[4695]: I0320 11:32:00.407268 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhpr9\" (UniqueName: \"kubernetes.io/projected/4110e916-7822-40e8-951c-1b6a864a0d2f-kube-api-access-vhpr9\") pod \"auto-csr-approver-29566772-qhj7r\" (UID: \"4110e916-7822-40e8-951c-1b6a864a0d2f\") " pod="openshift-infra/auto-csr-approver-29566772-qhj7r" Mar 20 11:32:00 crc kubenswrapper[4695]: I0320 11:32:00.442330 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhpr9\" (UniqueName: \"kubernetes.io/projected/4110e916-7822-40e8-951c-1b6a864a0d2f-kube-api-access-vhpr9\") pod \"auto-csr-approver-29566772-qhj7r\" (UID: \"4110e916-7822-40e8-951c-1b6a864a0d2f\") " 
pod="openshift-infra/auto-csr-approver-29566772-qhj7r" Mar 20 11:32:00 crc kubenswrapper[4695]: I0320 11:32:00.472899 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566772-qhj7r" Mar 20 11:32:00 crc kubenswrapper[4695]: I0320 11:32:00.947322 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566772-qhj7r"] Mar 20 11:32:01 crc kubenswrapper[4695]: I0320 11:32:01.322379 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566772-qhj7r" event={"ID":"4110e916-7822-40e8-951c-1b6a864a0d2f","Type":"ContainerStarted","Data":"3f0441e8ac66347f60fb0e4134b3b20b9cba304f1db5fcb9db73659fe07dc66a"} Mar 20 11:32:02 crc kubenswrapper[4695]: I0320 11:32:02.332485 4695 generic.go:334] "Generic (PLEG): container finished" podID="4110e916-7822-40e8-951c-1b6a864a0d2f" containerID="95bbe1a8ef45182ae69755ff696f95610c656c38f46b6f9588960c02eb3b944b" exitCode=0 Mar 20 11:32:02 crc kubenswrapper[4695]: I0320 11:32:02.332586 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566772-qhj7r" event={"ID":"4110e916-7822-40e8-951c-1b6a864a0d2f","Type":"ContainerDied","Data":"95bbe1a8ef45182ae69755ff696f95610c656c38f46b6f9588960c02eb3b944b"} Mar 20 11:32:03 crc kubenswrapper[4695]: I0320 11:32:03.632196 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566772-qhj7r" Mar 20 11:32:03 crc kubenswrapper[4695]: I0320 11:32:03.762285 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhpr9\" (UniqueName: \"kubernetes.io/projected/4110e916-7822-40e8-951c-1b6a864a0d2f-kube-api-access-vhpr9\") pod \"4110e916-7822-40e8-951c-1b6a864a0d2f\" (UID: \"4110e916-7822-40e8-951c-1b6a864a0d2f\") " Mar 20 11:32:03 crc kubenswrapper[4695]: I0320 11:32:03.769410 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4110e916-7822-40e8-951c-1b6a864a0d2f-kube-api-access-vhpr9" (OuterVolumeSpecName: "kube-api-access-vhpr9") pod "4110e916-7822-40e8-951c-1b6a864a0d2f" (UID: "4110e916-7822-40e8-951c-1b6a864a0d2f"). InnerVolumeSpecName "kube-api-access-vhpr9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:32:03 crc kubenswrapper[4695]: I0320 11:32:03.863798 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhpr9\" (UniqueName: \"kubernetes.io/projected/4110e916-7822-40e8-951c-1b6a864a0d2f-kube-api-access-vhpr9\") on node \"crc\" DevicePath \"\"" Mar 20 11:32:04 crc kubenswrapper[4695]: I0320 11:32:04.350755 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566772-qhj7r" event={"ID":"4110e916-7822-40e8-951c-1b6a864a0d2f","Type":"ContainerDied","Data":"3f0441e8ac66347f60fb0e4134b3b20b9cba304f1db5fcb9db73659fe07dc66a"} Mar 20 11:32:04 crc kubenswrapper[4695]: I0320 11:32:04.350807 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f0441e8ac66347f60fb0e4134b3b20b9cba304f1db5fcb9db73659fe07dc66a" Mar 20 11:32:04 crc kubenswrapper[4695]: I0320 11:32:04.350829 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566772-qhj7r" Mar 20 11:32:04 crc kubenswrapper[4695]: I0320 11:32:04.708098 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566766-lg4t7"] Mar 20 11:32:04 crc kubenswrapper[4695]: I0320 11:32:04.714701 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566766-lg4t7"] Mar 20 11:32:04 crc kubenswrapper[4695]: I0320 11:32:04.895273 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5539e8e1-a981-4038-8f12-530a9528dbb2" path="/var/lib/kubelet/pods/5539e8e1-a981-4038-8f12-530a9528dbb2/volumes" Mar 20 11:32:06 crc kubenswrapper[4695]: I0320 11:32:06.666087 4695 scope.go:117] "RemoveContainer" containerID="bf1453b540dd334fcc4dd2d405696be52af6ee20cc21d5d5c5ae3f40a8cea434" Mar 20 11:32:08 crc kubenswrapper[4695]: I0320 11:32:08.430439 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:32:08 crc kubenswrapper[4695]: I0320 11:32:08.430504 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:32:08 crc kubenswrapper[4695]: I0320 11:32:08.430553 4695 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" Mar 20 11:32:08 crc kubenswrapper[4695]: I0320 11:32:08.431231 4695 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd"} pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:32:08 crc kubenswrapper[4695]: I0320 11:32:08.431286 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" containerID="cri-o://36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd" gracePeriod=600 Mar 20 11:32:08 crc kubenswrapper[4695]: E0320 11:32:08.560101 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:32:09 crc kubenswrapper[4695]: I0320 11:32:09.389196 4695 generic.go:334] "Generic (PLEG): container finished" podID="7859c924-84d7-4855-901e-c77a02c56e3a" containerID="36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd" exitCode=0 Mar 20 11:32:09 crc kubenswrapper[4695]: I0320 11:32:09.389253 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" event={"ID":"7859c924-84d7-4855-901e-c77a02c56e3a","Type":"ContainerDied","Data":"36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd"} Mar 20 11:32:09 crc kubenswrapper[4695]: I0320 11:32:09.390127 4695 scope.go:117] "RemoveContainer" containerID="9ebbe35b81a4e25aff4f5292672403a71d315b60bc3a2d22e857c4f0326ec31a" Mar 20 11:32:09 crc kubenswrapper[4695]: I0320 11:32:09.390684 4695 
scope.go:117] "RemoveContainer" containerID="36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd" Mar 20 11:32:09 crc kubenswrapper[4695]: E0320 11:32:09.390949 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:32:22 crc kubenswrapper[4695]: I0320 11:32:22.897258 4695 scope.go:117] "RemoveContainer" containerID="36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd" Mar 20 11:32:22 crc kubenswrapper[4695]: E0320 11:32:22.898283 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:32:34 crc kubenswrapper[4695]: I0320 11:32:34.887325 4695 scope.go:117] "RemoveContainer" containerID="36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd" Mar 20 11:32:34 crc kubenswrapper[4695]: E0320 11:32:34.888147 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:32:46 crc kubenswrapper[4695]: I0320 
11:32:46.942788 4695 scope.go:117] "RemoveContainer" containerID="36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd" Mar 20 11:32:46 crc kubenswrapper[4695]: E0320 11:32:46.943810 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:33:00 crc kubenswrapper[4695]: I0320 11:33:00.887113 4695 scope.go:117] "RemoveContainer" containerID="36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd" Mar 20 11:33:00 crc kubenswrapper[4695]: E0320 11:33:00.888403 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:33:12 crc kubenswrapper[4695]: I0320 11:33:12.890233 4695 scope.go:117] "RemoveContainer" containerID="36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd" Mar 20 11:33:12 crc kubenswrapper[4695]: E0320 11:33:12.891140 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:33:23 crc 
kubenswrapper[4695]: I0320 11:33:23.887509 4695 scope.go:117] "RemoveContainer" containerID="36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd" Mar 20 11:33:23 crc kubenswrapper[4695]: E0320 11:33:23.888544 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:33:36 crc kubenswrapper[4695]: I0320 11:33:36.887315 4695 scope.go:117] "RemoveContainer" containerID="36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd" Mar 20 11:33:36 crc kubenswrapper[4695]: E0320 11:33:36.888563 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:33:50 crc kubenswrapper[4695]: I0320 11:33:50.887099 4695 scope.go:117] "RemoveContainer" containerID="36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd" Mar 20 11:33:50 crc kubenswrapper[4695]: E0320 11:33:50.888009 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 
20 11:33:59 crc kubenswrapper[4695]: I0320 11:33:59.362372 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bh2hz"] Mar 20 11:33:59 crc kubenswrapper[4695]: E0320 11:33:59.365006 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4110e916-7822-40e8-951c-1b6a864a0d2f" containerName="oc" Mar 20 11:33:59 crc kubenswrapper[4695]: I0320 11:33:59.365043 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="4110e916-7822-40e8-951c-1b6a864a0d2f" containerName="oc" Mar 20 11:33:59 crc kubenswrapper[4695]: I0320 11:33:59.365195 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="4110e916-7822-40e8-951c-1b6a864a0d2f" containerName="oc" Mar 20 11:33:59 crc kubenswrapper[4695]: I0320 11:33:59.366261 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bh2hz" Mar 20 11:33:59 crc kubenswrapper[4695]: I0320 11:33:59.393918 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bh2hz"] Mar 20 11:33:59 crc kubenswrapper[4695]: I0320 11:33:59.416968 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7130df8-d5de-4a52-a4f4-8ace5b8572f6-catalog-content\") pod \"redhat-marketplace-bh2hz\" (UID: \"f7130df8-d5de-4a52-a4f4-8ace5b8572f6\") " pod="openshift-marketplace/redhat-marketplace-bh2hz" Mar 20 11:33:59 crc kubenswrapper[4695]: I0320 11:33:59.417072 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7130df8-d5de-4a52-a4f4-8ace5b8572f6-utilities\") pod \"redhat-marketplace-bh2hz\" (UID: \"f7130df8-d5de-4a52-a4f4-8ace5b8572f6\") " pod="openshift-marketplace/redhat-marketplace-bh2hz" Mar 20 11:33:59 crc kubenswrapper[4695]: I0320 11:33:59.417147 4695 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvzq8\" (UniqueName: \"kubernetes.io/projected/f7130df8-d5de-4a52-a4f4-8ace5b8572f6-kube-api-access-jvzq8\") pod \"redhat-marketplace-bh2hz\" (UID: \"f7130df8-d5de-4a52-a4f4-8ace5b8572f6\") " pod="openshift-marketplace/redhat-marketplace-bh2hz" Mar 20 11:33:59 crc kubenswrapper[4695]: I0320 11:33:59.518276 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvzq8\" (UniqueName: \"kubernetes.io/projected/f7130df8-d5de-4a52-a4f4-8ace5b8572f6-kube-api-access-jvzq8\") pod \"redhat-marketplace-bh2hz\" (UID: \"f7130df8-d5de-4a52-a4f4-8ace5b8572f6\") " pod="openshift-marketplace/redhat-marketplace-bh2hz" Mar 20 11:33:59 crc kubenswrapper[4695]: I0320 11:33:59.518410 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7130df8-d5de-4a52-a4f4-8ace5b8572f6-catalog-content\") pod \"redhat-marketplace-bh2hz\" (UID: \"f7130df8-d5de-4a52-a4f4-8ace5b8572f6\") " pod="openshift-marketplace/redhat-marketplace-bh2hz" Mar 20 11:33:59 crc kubenswrapper[4695]: I0320 11:33:59.518442 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7130df8-d5de-4a52-a4f4-8ace5b8572f6-utilities\") pod \"redhat-marketplace-bh2hz\" (UID: \"f7130df8-d5de-4a52-a4f4-8ace5b8572f6\") " pod="openshift-marketplace/redhat-marketplace-bh2hz" Mar 20 11:33:59 crc kubenswrapper[4695]: I0320 11:33:59.519171 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7130df8-d5de-4a52-a4f4-8ace5b8572f6-utilities\") pod \"redhat-marketplace-bh2hz\" (UID: \"f7130df8-d5de-4a52-a4f4-8ace5b8572f6\") " pod="openshift-marketplace/redhat-marketplace-bh2hz" Mar 20 11:33:59 crc kubenswrapper[4695]: I0320 11:33:59.519452 4695 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7130df8-d5de-4a52-a4f4-8ace5b8572f6-catalog-content\") pod \"redhat-marketplace-bh2hz\" (UID: \"f7130df8-d5de-4a52-a4f4-8ace5b8572f6\") " pod="openshift-marketplace/redhat-marketplace-bh2hz" Mar 20 11:33:59 crc kubenswrapper[4695]: I0320 11:33:59.542858 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvzq8\" (UniqueName: \"kubernetes.io/projected/f7130df8-d5de-4a52-a4f4-8ace5b8572f6-kube-api-access-jvzq8\") pod \"redhat-marketplace-bh2hz\" (UID: \"f7130df8-d5de-4a52-a4f4-8ace5b8572f6\") " pod="openshift-marketplace/redhat-marketplace-bh2hz" Mar 20 11:33:59 crc kubenswrapper[4695]: I0320 11:33:59.710297 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bh2hz" Mar 20 11:34:00 crc kubenswrapper[4695]: I0320 11:34:00.164234 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566774-4xqkj"] Mar 20 11:34:00 crc kubenswrapper[4695]: I0320 11:34:00.165813 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566774-4xqkj" Mar 20 11:34:00 crc kubenswrapper[4695]: I0320 11:34:00.168962 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:34:00 crc kubenswrapper[4695]: I0320 11:34:00.169234 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5kqds" Mar 20 11:34:00 crc kubenswrapper[4695]: I0320 11:34:00.176940 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:34:00 crc kubenswrapper[4695]: I0320 11:34:00.179427 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566774-4xqkj"] Mar 20 11:34:00 crc kubenswrapper[4695]: I0320 11:34:00.228291 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bh2hz"] Mar 20 11:34:00 crc kubenswrapper[4695]: I0320 11:34:00.338988 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2rrf\" (UniqueName: \"kubernetes.io/projected/10b65570-3332-484b-8878-ad33a5716725-kube-api-access-r2rrf\") pod \"auto-csr-approver-29566774-4xqkj\" (UID: \"10b65570-3332-484b-8878-ad33a5716725\") " pod="openshift-infra/auto-csr-approver-29566774-4xqkj" Mar 20 11:34:00 crc kubenswrapper[4695]: I0320 11:34:00.440840 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2rrf\" (UniqueName: \"kubernetes.io/projected/10b65570-3332-484b-8878-ad33a5716725-kube-api-access-r2rrf\") pod \"auto-csr-approver-29566774-4xqkj\" (UID: \"10b65570-3332-484b-8878-ad33a5716725\") " pod="openshift-infra/auto-csr-approver-29566774-4xqkj" Mar 20 11:34:00 crc kubenswrapper[4695]: I0320 11:34:00.465111 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2rrf\" (UniqueName: 
\"kubernetes.io/projected/10b65570-3332-484b-8878-ad33a5716725-kube-api-access-r2rrf\") pod \"auto-csr-approver-29566774-4xqkj\" (UID: \"10b65570-3332-484b-8878-ad33a5716725\") " pod="openshift-infra/auto-csr-approver-29566774-4xqkj" Mar 20 11:34:00 crc kubenswrapper[4695]: I0320 11:34:00.471090 4695 generic.go:334] "Generic (PLEG): container finished" podID="f7130df8-d5de-4a52-a4f4-8ace5b8572f6" containerID="32ed8fc3db4367570985495b6e917cf268c75fe6649e4e13305fc0a495c0606e" exitCode=0 Mar 20 11:34:00 crc kubenswrapper[4695]: I0320 11:34:00.471183 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bh2hz" event={"ID":"f7130df8-d5de-4a52-a4f4-8ace5b8572f6","Type":"ContainerDied","Data":"32ed8fc3db4367570985495b6e917cf268c75fe6649e4e13305fc0a495c0606e"} Mar 20 11:34:00 crc kubenswrapper[4695]: I0320 11:34:00.471223 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bh2hz" event={"ID":"f7130df8-d5de-4a52-a4f4-8ace5b8572f6","Type":"ContainerStarted","Data":"6d8620fb226ff5bf1ee568f36353c7c180cedf3bcc4cff98bc2006a62ba04b33"} Mar 20 11:34:00 crc kubenswrapper[4695]: I0320 11:34:00.497767 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566774-4xqkj" Mar 20 11:34:01 crc kubenswrapper[4695]: I0320 11:34:01.459882 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566774-4xqkj"] Mar 20 11:34:01 crc kubenswrapper[4695]: W0320 11:34:01.486480 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10b65570_3332_484b_8878_ad33a5716725.slice/crio-c17f4c94118d1ed03ac88ef840909bb5df7af2b1087cb7576ff7806f1f55b43c WatchSource:0}: Error finding container c17f4c94118d1ed03ac88ef840909bb5df7af2b1087cb7576ff7806f1f55b43c: Status 404 returned error can't find the container with id c17f4c94118d1ed03ac88ef840909bb5df7af2b1087cb7576ff7806f1f55b43c Mar 20 11:34:01 crc kubenswrapper[4695]: I0320 11:34:01.887648 4695 scope.go:117] "RemoveContainer" containerID="36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd" Mar 20 11:34:01 crc kubenswrapper[4695]: E0320 11:34:01.887907 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:34:02 crc kubenswrapper[4695]: I0320 11:34:02.491735 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566774-4xqkj" event={"ID":"10b65570-3332-484b-8878-ad33a5716725","Type":"ContainerStarted","Data":"c17f4c94118d1ed03ac88ef840909bb5df7af2b1087cb7576ff7806f1f55b43c"} Mar 20 11:34:02 crc kubenswrapper[4695]: I0320 11:34:02.494432 4695 generic.go:334] "Generic (PLEG): container finished" podID="f7130df8-d5de-4a52-a4f4-8ace5b8572f6" 
containerID="41e6b2f330ca1e21035fc09cf3c3372b1f40eb246c116be79299e48fe4e13eba" exitCode=0 Mar 20 11:34:02 crc kubenswrapper[4695]: I0320 11:34:02.494464 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bh2hz" event={"ID":"f7130df8-d5de-4a52-a4f4-8ace5b8572f6","Type":"ContainerDied","Data":"41e6b2f330ca1e21035fc09cf3c3372b1f40eb246c116be79299e48fe4e13eba"} Mar 20 11:34:03 crc kubenswrapper[4695]: I0320 11:34:03.523669 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bh2hz" event={"ID":"f7130df8-d5de-4a52-a4f4-8ace5b8572f6","Type":"ContainerStarted","Data":"1f1778cee5a244c6f5948fdb049ff8a951bb2e1f02a14134394106988010884a"} Mar 20 11:34:03 crc kubenswrapper[4695]: I0320 11:34:03.530412 4695 generic.go:334] "Generic (PLEG): container finished" podID="10b65570-3332-484b-8878-ad33a5716725" containerID="0939349a9e09c57aeb3da6ecab4682362e1799e4d306026f942bc427de6c773c" exitCode=0 Mar 20 11:34:03 crc kubenswrapper[4695]: I0320 11:34:03.530572 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566774-4xqkj" event={"ID":"10b65570-3332-484b-8878-ad33a5716725","Type":"ContainerDied","Data":"0939349a9e09c57aeb3da6ecab4682362e1799e4d306026f942bc427de6c773c"} Mar 20 11:34:03 crc kubenswrapper[4695]: I0320 11:34:03.557813 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bh2hz" podStartSLOduration=2.162389019 podStartE2EDuration="4.557778029s" podCreationTimestamp="2026-03-20 11:33:59 +0000 UTC" firstStartedPulling="2026-03-20 11:34:00.473554174 +0000 UTC m=+2418.254159747" lastFinishedPulling="2026-03-20 11:34:02.868943194 +0000 UTC m=+2420.649548757" observedRunningTime="2026-03-20 11:34:03.551520849 +0000 UTC m=+2421.332126432" watchObservedRunningTime="2026-03-20 11:34:03.557778029 +0000 UTC m=+2421.338383612" Mar 20 11:34:04 crc kubenswrapper[4695]: I0320 
11:34:04.864504 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566774-4xqkj" Mar 20 11:34:05 crc kubenswrapper[4695]: I0320 11:34:05.023937 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2rrf\" (UniqueName: \"kubernetes.io/projected/10b65570-3332-484b-8878-ad33a5716725-kube-api-access-r2rrf\") pod \"10b65570-3332-484b-8878-ad33a5716725\" (UID: \"10b65570-3332-484b-8878-ad33a5716725\") " Mar 20 11:34:05 crc kubenswrapper[4695]: I0320 11:34:05.035263 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10b65570-3332-484b-8878-ad33a5716725-kube-api-access-r2rrf" (OuterVolumeSpecName: "kube-api-access-r2rrf") pod "10b65570-3332-484b-8878-ad33a5716725" (UID: "10b65570-3332-484b-8878-ad33a5716725"). InnerVolumeSpecName "kube-api-access-r2rrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:34:05 crc kubenswrapper[4695]: I0320 11:34:05.125810 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2rrf\" (UniqueName: \"kubernetes.io/projected/10b65570-3332-484b-8878-ad33a5716725-kube-api-access-r2rrf\") on node \"crc\" DevicePath \"\"" Mar 20 11:34:05 crc kubenswrapper[4695]: I0320 11:34:05.548658 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566774-4xqkj" event={"ID":"10b65570-3332-484b-8878-ad33a5716725","Type":"ContainerDied","Data":"c17f4c94118d1ed03ac88ef840909bb5df7af2b1087cb7576ff7806f1f55b43c"} Mar 20 11:34:05 crc kubenswrapper[4695]: I0320 11:34:05.549062 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c17f4c94118d1ed03ac88ef840909bb5df7af2b1087cb7576ff7806f1f55b43c" Mar 20 11:34:05 crc kubenswrapper[4695]: I0320 11:34:05.548734 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566774-4xqkj" Mar 20 11:34:05 crc kubenswrapper[4695]: I0320 11:34:05.957990 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566768-r275f"] Mar 20 11:34:05 crc kubenswrapper[4695]: I0320 11:34:05.966769 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566768-r275f"] Mar 20 11:34:06 crc kubenswrapper[4695]: I0320 11:34:06.894862 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b99841f-5ced-44a9-baff-04c5e75c5fb0" path="/var/lib/kubelet/pods/0b99841f-5ced-44a9-baff-04c5e75c5fb0/volumes" Mar 20 11:34:09 crc kubenswrapper[4695]: I0320 11:34:09.711128 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bh2hz" Mar 20 11:34:09 crc kubenswrapper[4695]: I0320 11:34:09.711366 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bh2hz" Mar 20 11:34:09 crc kubenswrapper[4695]: I0320 11:34:09.757988 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bh2hz" Mar 20 11:34:10 crc kubenswrapper[4695]: I0320 11:34:10.625219 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bh2hz" Mar 20 11:34:10 crc kubenswrapper[4695]: I0320 11:34:10.673070 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bh2hz"] Mar 20 11:34:12 crc kubenswrapper[4695]: I0320 11:34:12.616569 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bh2hz" podUID="f7130df8-d5de-4a52-a4f4-8ace5b8572f6" containerName="registry-server" containerID="cri-o://1f1778cee5a244c6f5948fdb049ff8a951bb2e1f02a14134394106988010884a" gracePeriod=2 Mar 20 11:34:13 crc 
kubenswrapper[4695]: I0320 11:34:13.022973 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bh2hz" Mar 20 11:34:13 crc kubenswrapper[4695]: I0320 11:34:13.068232 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvzq8\" (UniqueName: \"kubernetes.io/projected/f7130df8-d5de-4a52-a4f4-8ace5b8572f6-kube-api-access-jvzq8\") pod \"f7130df8-d5de-4a52-a4f4-8ace5b8572f6\" (UID: \"f7130df8-d5de-4a52-a4f4-8ace5b8572f6\") " Mar 20 11:34:13 crc kubenswrapper[4695]: I0320 11:34:13.068392 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7130df8-d5de-4a52-a4f4-8ace5b8572f6-catalog-content\") pod \"f7130df8-d5de-4a52-a4f4-8ace5b8572f6\" (UID: \"f7130df8-d5de-4a52-a4f4-8ace5b8572f6\") " Mar 20 11:34:13 crc kubenswrapper[4695]: I0320 11:34:13.068518 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7130df8-d5de-4a52-a4f4-8ace5b8572f6-utilities\") pod \"f7130df8-d5de-4a52-a4f4-8ace5b8572f6\" (UID: \"f7130df8-d5de-4a52-a4f4-8ace5b8572f6\") " Mar 20 11:34:13 crc kubenswrapper[4695]: I0320 11:34:13.070547 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7130df8-d5de-4a52-a4f4-8ace5b8572f6-utilities" (OuterVolumeSpecName: "utilities") pod "f7130df8-d5de-4a52-a4f4-8ace5b8572f6" (UID: "f7130df8-d5de-4a52-a4f4-8ace5b8572f6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:34:13 crc kubenswrapper[4695]: I0320 11:34:13.079665 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7130df8-d5de-4a52-a4f4-8ace5b8572f6-kube-api-access-jvzq8" (OuterVolumeSpecName: "kube-api-access-jvzq8") pod "f7130df8-d5de-4a52-a4f4-8ace5b8572f6" (UID: "f7130df8-d5de-4a52-a4f4-8ace5b8572f6"). InnerVolumeSpecName "kube-api-access-jvzq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:34:13 crc kubenswrapper[4695]: I0320 11:34:13.098256 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7130df8-d5de-4a52-a4f4-8ace5b8572f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f7130df8-d5de-4a52-a4f4-8ace5b8572f6" (UID: "f7130df8-d5de-4a52-a4f4-8ace5b8572f6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:34:13 crc kubenswrapper[4695]: I0320 11:34:13.170899 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvzq8\" (UniqueName: \"kubernetes.io/projected/f7130df8-d5de-4a52-a4f4-8ace5b8572f6-kube-api-access-jvzq8\") on node \"crc\" DevicePath \"\"" Mar 20 11:34:13 crc kubenswrapper[4695]: I0320 11:34:13.170996 4695 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7130df8-d5de-4a52-a4f4-8ace5b8572f6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:34:13 crc kubenswrapper[4695]: I0320 11:34:13.171008 4695 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7130df8-d5de-4a52-a4f4-8ace5b8572f6-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:34:13 crc kubenswrapper[4695]: I0320 11:34:13.627440 4695 generic.go:334] "Generic (PLEG): container finished" podID="f7130df8-d5de-4a52-a4f4-8ace5b8572f6" 
containerID="1f1778cee5a244c6f5948fdb049ff8a951bb2e1f02a14134394106988010884a" exitCode=0 Mar 20 11:34:13 crc kubenswrapper[4695]: I0320 11:34:13.627508 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bh2hz" Mar 20 11:34:13 crc kubenswrapper[4695]: I0320 11:34:13.627509 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bh2hz" event={"ID":"f7130df8-d5de-4a52-a4f4-8ace5b8572f6","Type":"ContainerDied","Data":"1f1778cee5a244c6f5948fdb049ff8a951bb2e1f02a14134394106988010884a"} Mar 20 11:34:13 crc kubenswrapper[4695]: I0320 11:34:13.628600 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bh2hz" event={"ID":"f7130df8-d5de-4a52-a4f4-8ace5b8572f6","Type":"ContainerDied","Data":"6d8620fb226ff5bf1ee568f36353c7c180cedf3bcc4cff98bc2006a62ba04b33"} Mar 20 11:34:13 crc kubenswrapper[4695]: I0320 11:34:13.628622 4695 scope.go:117] "RemoveContainer" containerID="1f1778cee5a244c6f5948fdb049ff8a951bb2e1f02a14134394106988010884a" Mar 20 11:34:13 crc kubenswrapper[4695]: I0320 11:34:13.660353 4695 scope.go:117] "RemoveContainer" containerID="41e6b2f330ca1e21035fc09cf3c3372b1f40eb246c116be79299e48fe4e13eba" Mar 20 11:34:13 crc kubenswrapper[4695]: I0320 11:34:13.678591 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bh2hz"] Mar 20 11:34:13 crc kubenswrapper[4695]: I0320 11:34:13.686596 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bh2hz"] Mar 20 11:34:13 crc kubenswrapper[4695]: I0320 11:34:13.687057 4695 scope.go:117] "RemoveContainer" containerID="32ed8fc3db4367570985495b6e917cf268c75fe6649e4e13305fc0a495c0606e" Mar 20 11:34:13 crc kubenswrapper[4695]: I0320 11:34:13.707752 4695 scope.go:117] "RemoveContainer" containerID="1f1778cee5a244c6f5948fdb049ff8a951bb2e1f02a14134394106988010884a" Mar 20 
11:34:13 crc kubenswrapper[4695]: E0320 11:34:13.710814 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f1778cee5a244c6f5948fdb049ff8a951bb2e1f02a14134394106988010884a\": container with ID starting with 1f1778cee5a244c6f5948fdb049ff8a951bb2e1f02a14134394106988010884a not found: ID does not exist" containerID="1f1778cee5a244c6f5948fdb049ff8a951bb2e1f02a14134394106988010884a" Mar 20 11:34:13 crc kubenswrapper[4695]: I0320 11:34:13.710863 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f1778cee5a244c6f5948fdb049ff8a951bb2e1f02a14134394106988010884a"} err="failed to get container status \"1f1778cee5a244c6f5948fdb049ff8a951bb2e1f02a14134394106988010884a\": rpc error: code = NotFound desc = could not find container \"1f1778cee5a244c6f5948fdb049ff8a951bb2e1f02a14134394106988010884a\": container with ID starting with 1f1778cee5a244c6f5948fdb049ff8a951bb2e1f02a14134394106988010884a not found: ID does not exist" Mar 20 11:34:13 crc kubenswrapper[4695]: I0320 11:34:13.710895 4695 scope.go:117] "RemoveContainer" containerID="41e6b2f330ca1e21035fc09cf3c3372b1f40eb246c116be79299e48fe4e13eba" Mar 20 11:34:13 crc kubenswrapper[4695]: E0320 11:34:13.711429 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41e6b2f330ca1e21035fc09cf3c3372b1f40eb246c116be79299e48fe4e13eba\": container with ID starting with 41e6b2f330ca1e21035fc09cf3c3372b1f40eb246c116be79299e48fe4e13eba not found: ID does not exist" containerID="41e6b2f330ca1e21035fc09cf3c3372b1f40eb246c116be79299e48fe4e13eba" Mar 20 11:34:13 crc kubenswrapper[4695]: I0320 11:34:13.711459 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41e6b2f330ca1e21035fc09cf3c3372b1f40eb246c116be79299e48fe4e13eba"} err="failed to get container status 
\"41e6b2f330ca1e21035fc09cf3c3372b1f40eb246c116be79299e48fe4e13eba\": rpc error: code = NotFound desc = could not find container \"41e6b2f330ca1e21035fc09cf3c3372b1f40eb246c116be79299e48fe4e13eba\": container with ID starting with 41e6b2f330ca1e21035fc09cf3c3372b1f40eb246c116be79299e48fe4e13eba not found: ID does not exist" Mar 20 11:34:13 crc kubenswrapper[4695]: I0320 11:34:13.711475 4695 scope.go:117] "RemoveContainer" containerID="32ed8fc3db4367570985495b6e917cf268c75fe6649e4e13305fc0a495c0606e" Mar 20 11:34:13 crc kubenswrapper[4695]: E0320 11:34:13.712277 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32ed8fc3db4367570985495b6e917cf268c75fe6649e4e13305fc0a495c0606e\": container with ID starting with 32ed8fc3db4367570985495b6e917cf268c75fe6649e4e13305fc0a495c0606e not found: ID does not exist" containerID="32ed8fc3db4367570985495b6e917cf268c75fe6649e4e13305fc0a495c0606e" Mar 20 11:34:13 crc kubenswrapper[4695]: I0320 11:34:13.712307 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32ed8fc3db4367570985495b6e917cf268c75fe6649e4e13305fc0a495c0606e"} err="failed to get container status \"32ed8fc3db4367570985495b6e917cf268c75fe6649e4e13305fc0a495c0606e\": rpc error: code = NotFound desc = could not find container \"32ed8fc3db4367570985495b6e917cf268c75fe6649e4e13305fc0a495c0606e\": container with ID starting with 32ed8fc3db4367570985495b6e917cf268c75fe6649e4e13305fc0a495c0606e not found: ID does not exist" Mar 20 11:34:14 crc kubenswrapper[4695]: I0320 11:34:14.896753 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7130df8-d5de-4a52-a4f4-8ace5b8572f6" path="/var/lib/kubelet/pods/f7130df8-d5de-4a52-a4f4-8ace5b8572f6/volumes" Mar 20 11:34:15 crc kubenswrapper[4695]: I0320 11:34:15.887714 4695 scope.go:117] "RemoveContainer" containerID="36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd" Mar 20 
11:34:15 crc kubenswrapper[4695]: E0320 11:34:15.888587 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:34:29 crc kubenswrapper[4695]: I0320 11:34:29.888091 4695 scope.go:117] "RemoveContainer" containerID="36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd" Mar 20 11:34:29 crc kubenswrapper[4695]: E0320 11:34:29.889081 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:34:41 crc kubenswrapper[4695]: I0320 11:34:41.887566 4695 scope.go:117] "RemoveContainer" containerID="36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd" Mar 20 11:34:41 crc kubenswrapper[4695]: E0320 11:34:41.888654 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:34:54 crc kubenswrapper[4695]: I0320 11:34:54.887597 4695 scope.go:117] "RemoveContainer" 
containerID="36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd" Mar 20 11:34:54 crc kubenswrapper[4695]: E0320 11:34:54.888636 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:35:06 crc kubenswrapper[4695]: I0320 11:35:06.797356 4695 scope.go:117] "RemoveContainer" containerID="ca6707bc535c3aa99ca292467ec7b3b6ae4c1386a3dc23b9735ad120fae39464" Mar 20 11:35:08 crc kubenswrapper[4695]: I0320 11:35:08.888537 4695 scope.go:117] "RemoveContainer" containerID="36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd" Mar 20 11:35:08 crc kubenswrapper[4695]: E0320 11:35:08.889371 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:35:20 crc kubenswrapper[4695]: I0320 11:35:20.887809 4695 scope.go:117] "RemoveContainer" containerID="36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd" Mar 20 11:35:20 crc kubenswrapper[4695]: E0320 11:35:20.888766 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:35:31 crc kubenswrapper[4695]: I0320 11:35:31.887351 4695 scope.go:117] "RemoveContainer" containerID="36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd" Mar 20 11:35:31 crc kubenswrapper[4695]: E0320 11:35:31.888549 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:35:46 crc kubenswrapper[4695]: I0320 11:35:46.078034 4695 scope.go:117] "RemoveContainer" containerID="36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd" Mar 20 11:35:46 crc kubenswrapper[4695]: E0320 11:35:46.079053 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:35:59 crc kubenswrapper[4695]: I0320 11:35:59.886938 4695 scope.go:117] "RemoveContainer" containerID="36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd" Mar 20 11:35:59 crc kubenswrapper[4695]: E0320 11:35:59.888167 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:36:00 crc kubenswrapper[4695]: I0320 11:36:00.151833 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566776-7mwns"] Mar 20 11:36:00 crc kubenswrapper[4695]: E0320 11:36:00.152315 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7130df8-d5de-4a52-a4f4-8ace5b8572f6" containerName="extract-content" Mar 20 11:36:00 crc kubenswrapper[4695]: I0320 11:36:00.152341 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7130df8-d5de-4a52-a4f4-8ace5b8572f6" containerName="extract-content" Mar 20 11:36:00 crc kubenswrapper[4695]: E0320 11:36:00.152359 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10b65570-3332-484b-8878-ad33a5716725" containerName="oc" Mar 20 11:36:00 crc kubenswrapper[4695]: I0320 11:36:00.152369 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="10b65570-3332-484b-8878-ad33a5716725" containerName="oc" Mar 20 11:36:00 crc kubenswrapper[4695]: E0320 11:36:00.152388 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7130df8-d5de-4a52-a4f4-8ace5b8572f6" containerName="registry-server" Mar 20 11:36:00 crc kubenswrapper[4695]: I0320 11:36:00.152396 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7130df8-d5de-4a52-a4f4-8ace5b8572f6" containerName="registry-server" Mar 20 11:36:00 crc kubenswrapper[4695]: E0320 11:36:00.152412 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7130df8-d5de-4a52-a4f4-8ace5b8572f6" containerName="extract-utilities" Mar 20 11:36:00 crc kubenswrapper[4695]: I0320 11:36:00.152419 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7130df8-d5de-4a52-a4f4-8ace5b8572f6" containerName="extract-utilities" Mar 20 11:36:00 crc kubenswrapper[4695]: 
I0320 11:36:00.152629 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7130df8-d5de-4a52-a4f4-8ace5b8572f6" containerName="registry-server" Mar 20 11:36:00 crc kubenswrapper[4695]: I0320 11:36:00.152648 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="10b65570-3332-484b-8878-ad33a5716725" containerName="oc" Mar 20 11:36:00 crc kubenswrapper[4695]: I0320 11:36:00.153286 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566776-7mwns" Mar 20 11:36:00 crc kubenswrapper[4695]: I0320 11:36:00.156040 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5kqds" Mar 20 11:36:00 crc kubenswrapper[4695]: I0320 11:36:00.156497 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:36:00 crc kubenswrapper[4695]: I0320 11:36:00.166731 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566776-7mwns"] Mar 20 11:36:00 crc kubenswrapper[4695]: I0320 11:36:00.168633 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:36:00 crc kubenswrapper[4695]: I0320 11:36:00.193148 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfrjf\" (UniqueName: \"kubernetes.io/projected/a225d102-4c7c-4cf6-b565-a248cc7583fb-kube-api-access-wfrjf\") pod \"auto-csr-approver-29566776-7mwns\" (UID: \"a225d102-4c7c-4cf6-b565-a248cc7583fb\") " pod="openshift-infra/auto-csr-approver-29566776-7mwns" Mar 20 11:36:00 crc kubenswrapper[4695]: I0320 11:36:00.294971 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfrjf\" (UniqueName: \"kubernetes.io/projected/a225d102-4c7c-4cf6-b565-a248cc7583fb-kube-api-access-wfrjf\") pod \"auto-csr-approver-29566776-7mwns\" 
(UID: \"a225d102-4c7c-4cf6-b565-a248cc7583fb\") " pod="openshift-infra/auto-csr-approver-29566776-7mwns" Mar 20 11:36:00 crc kubenswrapper[4695]: I0320 11:36:00.479298 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfrjf\" (UniqueName: \"kubernetes.io/projected/a225d102-4c7c-4cf6-b565-a248cc7583fb-kube-api-access-wfrjf\") pod \"auto-csr-approver-29566776-7mwns\" (UID: \"a225d102-4c7c-4cf6-b565-a248cc7583fb\") " pod="openshift-infra/auto-csr-approver-29566776-7mwns" Mar 20 11:36:00 crc kubenswrapper[4695]: I0320 11:36:00.479814 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566776-7mwns" Mar 20 11:36:00 crc kubenswrapper[4695]: I0320 11:36:00.941327 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566776-7mwns"] Mar 20 11:36:01 crc kubenswrapper[4695]: I0320 11:36:01.214689 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566776-7mwns" event={"ID":"a225d102-4c7c-4cf6-b565-a248cc7583fb","Type":"ContainerStarted","Data":"2fa833e79e595ded35c24dee5c4e874a56dbe73ba2564f8de2ccd9ea7becd227"} Mar 20 11:36:03 crc kubenswrapper[4695]: I0320 11:36:03.234857 4695 generic.go:334] "Generic (PLEG): container finished" podID="a225d102-4c7c-4cf6-b565-a248cc7583fb" containerID="e50bfd96d3f1499ab27cc93a9ceed268bb610157cac3e1312228fb1ee4341cc9" exitCode=0 Mar 20 11:36:03 crc kubenswrapper[4695]: I0320 11:36:03.234957 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566776-7mwns" event={"ID":"a225d102-4c7c-4cf6-b565-a248cc7583fb","Type":"ContainerDied","Data":"e50bfd96d3f1499ab27cc93a9ceed268bb610157cac3e1312228fb1ee4341cc9"} Mar 20 11:36:04 crc kubenswrapper[4695]: I0320 11:36:04.541826 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566776-7mwns" Mar 20 11:36:04 crc kubenswrapper[4695]: I0320 11:36:04.697345 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfrjf\" (UniqueName: \"kubernetes.io/projected/a225d102-4c7c-4cf6-b565-a248cc7583fb-kube-api-access-wfrjf\") pod \"a225d102-4c7c-4cf6-b565-a248cc7583fb\" (UID: \"a225d102-4c7c-4cf6-b565-a248cc7583fb\") " Mar 20 11:36:04 crc kubenswrapper[4695]: I0320 11:36:04.704599 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a225d102-4c7c-4cf6-b565-a248cc7583fb-kube-api-access-wfrjf" (OuterVolumeSpecName: "kube-api-access-wfrjf") pod "a225d102-4c7c-4cf6-b565-a248cc7583fb" (UID: "a225d102-4c7c-4cf6-b565-a248cc7583fb"). InnerVolumeSpecName "kube-api-access-wfrjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:36:04 crc kubenswrapper[4695]: I0320 11:36:04.800076 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfrjf\" (UniqueName: \"kubernetes.io/projected/a225d102-4c7c-4cf6-b565-a248cc7583fb-kube-api-access-wfrjf\") on node \"crc\" DevicePath \"\"" Mar 20 11:36:05 crc kubenswrapper[4695]: I0320 11:36:05.254420 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566776-7mwns" event={"ID":"a225d102-4c7c-4cf6-b565-a248cc7583fb","Type":"ContainerDied","Data":"2fa833e79e595ded35c24dee5c4e874a56dbe73ba2564f8de2ccd9ea7becd227"} Mar 20 11:36:05 crc kubenswrapper[4695]: I0320 11:36:05.254524 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2fa833e79e595ded35c24dee5c4e874a56dbe73ba2564f8de2ccd9ea7becd227" Mar 20 11:36:05 crc kubenswrapper[4695]: I0320 11:36:05.254572 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566776-7mwns" Mar 20 11:36:05 crc kubenswrapper[4695]: I0320 11:36:05.628572 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566770-vmtxw"] Mar 20 11:36:05 crc kubenswrapper[4695]: I0320 11:36:05.633823 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566770-vmtxw"] Mar 20 11:36:06 crc kubenswrapper[4695]: I0320 11:36:06.899579 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56d23936-2917-4468-b9a2-ab9eb4cea0cf" path="/var/lib/kubelet/pods/56d23936-2917-4468-b9a2-ab9eb4cea0cf/volumes" Mar 20 11:36:12 crc kubenswrapper[4695]: I0320 11:36:12.891002 4695 scope.go:117] "RemoveContainer" containerID="36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd" Mar 20 11:36:12 crc kubenswrapper[4695]: E0320 11:36:12.892039 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:36:24 crc kubenswrapper[4695]: I0320 11:36:24.888345 4695 scope.go:117] "RemoveContainer" containerID="36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd" Mar 20 11:36:24 crc kubenswrapper[4695]: E0320 11:36:24.889403 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" 
podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:36:36 crc kubenswrapper[4695]: I0320 11:36:36.887226 4695 scope.go:117] "RemoveContainer" containerID="36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd" Mar 20 11:36:36 crc kubenswrapper[4695]: E0320 11:36:36.888382 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:36:49 crc kubenswrapper[4695]: I0320 11:36:49.888107 4695 scope.go:117] "RemoveContainer" containerID="36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd" Mar 20 11:36:49 crc kubenswrapper[4695]: E0320 11:36:49.888940 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:37:02 crc kubenswrapper[4695]: I0320 11:37:02.891600 4695 scope.go:117] "RemoveContainer" containerID="36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd" Mar 20 11:37:02 crc kubenswrapper[4695]: E0320 11:37:02.892701 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:37:06 crc kubenswrapper[4695]: I0320 11:37:06.906503 4695 scope.go:117] "RemoveContainer" containerID="3b2bf591e1cec82c24bf3f85332f2a58de4652db80000addab9d7258253dc6d3" Mar 20 11:37:17 crc kubenswrapper[4695]: I0320 11:37:17.887318 4695 scope.go:117] "RemoveContainer" containerID="36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd" Mar 20 11:37:18 crc kubenswrapper[4695]: I0320 11:37:18.806845 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" event={"ID":"7859c924-84d7-4855-901e-c77a02c56e3a","Type":"ContainerStarted","Data":"75ca987e90ee0fcab0d5950d62b41475299b0a6565381ed22317c122106f8d27"} Mar 20 11:38:00 crc kubenswrapper[4695]: I0320 11:38:00.155714 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566778-xn2gw"] Mar 20 11:38:00 crc kubenswrapper[4695]: E0320 11:38:00.157541 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a225d102-4c7c-4cf6-b565-a248cc7583fb" containerName="oc" Mar 20 11:38:00 crc kubenswrapper[4695]: I0320 11:38:00.157567 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="a225d102-4c7c-4cf6-b565-a248cc7583fb" containerName="oc" Mar 20 11:38:00 crc kubenswrapper[4695]: I0320 11:38:00.157784 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="a225d102-4c7c-4cf6-b565-a248cc7583fb" containerName="oc" Mar 20 11:38:00 crc kubenswrapper[4695]: I0320 11:38:00.158600 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566778-xn2gw" Mar 20 11:38:00 crc kubenswrapper[4695]: I0320 11:38:00.161547 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5kqds" Mar 20 11:38:00 crc kubenswrapper[4695]: I0320 11:38:00.161936 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:38:00 crc kubenswrapper[4695]: I0320 11:38:00.163101 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:38:00 crc kubenswrapper[4695]: I0320 11:38:00.171138 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566778-xn2gw"] Mar 20 11:38:00 crc kubenswrapper[4695]: I0320 11:38:00.319066 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zb5k\" (UniqueName: \"kubernetes.io/projected/715c3ab3-d767-4433-aac3-fbb7a527f63a-kube-api-access-7zb5k\") pod \"auto-csr-approver-29566778-xn2gw\" (UID: \"715c3ab3-d767-4433-aac3-fbb7a527f63a\") " pod="openshift-infra/auto-csr-approver-29566778-xn2gw" Mar 20 11:38:00 crc kubenswrapper[4695]: I0320 11:38:00.420868 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zb5k\" (UniqueName: \"kubernetes.io/projected/715c3ab3-d767-4433-aac3-fbb7a527f63a-kube-api-access-7zb5k\") pod \"auto-csr-approver-29566778-xn2gw\" (UID: \"715c3ab3-d767-4433-aac3-fbb7a527f63a\") " pod="openshift-infra/auto-csr-approver-29566778-xn2gw" Mar 20 11:38:00 crc kubenswrapper[4695]: I0320 11:38:00.445833 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zb5k\" (UniqueName: \"kubernetes.io/projected/715c3ab3-d767-4433-aac3-fbb7a527f63a-kube-api-access-7zb5k\") pod \"auto-csr-approver-29566778-xn2gw\" (UID: \"715c3ab3-d767-4433-aac3-fbb7a527f63a\") " 
pod="openshift-infra/auto-csr-approver-29566778-xn2gw" Mar 20 11:38:00 crc kubenswrapper[4695]: I0320 11:38:00.485976 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566778-xn2gw" Mar 20 11:38:00 crc kubenswrapper[4695]: I0320 11:38:00.925497 4695 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:38:00 crc kubenswrapper[4695]: I0320 11:38:00.926213 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566778-xn2gw"] Mar 20 11:38:01 crc kubenswrapper[4695]: I0320 11:38:01.158057 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566778-xn2gw" event={"ID":"715c3ab3-d767-4433-aac3-fbb7a527f63a","Type":"ContainerStarted","Data":"75abed37101477d6fad453103b0066e88000dd617ddb110367659da36a971847"} Mar 20 11:38:02 crc kubenswrapper[4695]: I0320 11:38:02.167109 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566778-xn2gw" event={"ID":"715c3ab3-d767-4433-aac3-fbb7a527f63a","Type":"ContainerStarted","Data":"1950b28e687eeb720c55fa7ed6ec631462f740c9144ccf4b0a43afef767560dd"} Mar 20 11:38:03 crc kubenswrapper[4695]: I0320 11:38:03.187398 4695 generic.go:334] "Generic (PLEG): container finished" podID="715c3ab3-d767-4433-aac3-fbb7a527f63a" containerID="1950b28e687eeb720c55fa7ed6ec631462f740c9144ccf4b0a43afef767560dd" exitCode=0 Mar 20 11:38:03 crc kubenswrapper[4695]: I0320 11:38:03.187535 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566778-xn2gw" event={"ID":"715c3ab3-d767-4433-aac3-fbb7a527f63a","Type":"ContainerDied","Data":"1950b28e687eeb720c55fa7ed6ec631462f740c9144ccf4b0a43afef767560dd"} Mar 20 11:38:04 crc kubenswrapper[4695]: I0320 11:38:04.506488 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566778-xn2gw" Mar 20 11:38:04 crc kubenswrapper[4695]: I0320 11:38:04.697883 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zb5k\" (UniqueName: \"kubernetes.io/projected/715c3ab3-d767-4433-aac3-fbb7a527f63a-kube-api-access-7zb5k\") pod \"715c3ab3-d767-4433-aac3-fbb7a527f63a\" (UID: \"715c3ab3-d767-4433-aac3-fbb7a527f63a\") " Mar 20 11:38:04 crc kubenswrapper[4695]: I0320 11:38:04.704826 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/715c3ab3-d767-4433-aac3-fbb7a527f63a-kube-api-access-7zb5k" (OuterVolumeSpecName: "kube-api-access-7zb5k") pod "715c3ab3-d767-4433-aac3-fbb7a527f63a" (UID: "715c3ab3-d767-4433-aac3-fbb7a527f63a"). InnerVolumeSpecName "kube-api-access-7zb5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:38:04 crc kubenswrapper[4695]: I0320 11:38:04.800014 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zb5k\" (UniqueName: \"kubernetes.io/projected/715c3ab3-d767-4433-aac3-fbb7a527f63a-kube-api-access-7zb5k\") on node \"crc\" DevicePath \"\"" Mar 20 11:38:05 crc kubenswrapper[4695]: I0320 11:38:05.205444 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566778-xn2gw" Mar 20 11:38:05 crc kubenswrapper[4695]: I0320 11:38:05.205355 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566778-xn2gw" event={"ID":"715c3ab3-d767-4433-aac3-fbb7a527f63a","Type":"ContainerDied","Data":"75abed37101477d6fad453103b0066e88000dd617ddb110367659da36a971847"} Mar 20 11:38:05 crc kubenswrapper[4695]: I0320 11:38:05.208258 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75abed37101477d6fad453103b0066e88000dd617ddb110367659da36a971847" Mar 20 11:38:05 crc kubenswrapper[4695]: I0320 11:38:05.263200 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566772-qhj7r"] Mar 20 11:38:05 crc kubenswrapper[4695]: I0320 11:38:05.268732 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566772-qhj7r"] Mar 20 11:38:06 crc kubenswrapper[4695]: I0320 11:38:06.899929 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4110e916-7822-40e8-951c-1b6a864a0d2f" path="/var/lib/kubelet/pods/4110e916-7822-40e8-951c-1b6a864a0d2f/volumes" Mar 20 11:38:06 crc kubenswrapper[4695]: I0320 11:38:06.994018 4695 scope.go:117] "RemoveContainer" containerID="95bbe1a8ef45182ae69755ff696f95610c656c38f46b6f9588960c02eb3b944b" Mar 20 11:39:11 crc kubenswrapper[4695]: I0320 11:39:11.387139 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b4k74"] Mar 20 11:39:11 crc kubenswrapper[4695]: E0320 11:39:11.388064 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="715c3ab3-d767-4433-aac3-fbb7a527f63a" containerName="oc" Mar 20 11:39:11 crc kubenswrapper[4695]: I0320 11:39:11.388078 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="715c3ab3-d767-4433-aac3-fbb7a527f63a" containerName="oc" Mar 20 11:39:11 crc kubenswrapper[4695]: I0320 11:39:11.388236 
4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="715c3ab3-d767-4433-aac3-fbb7a527f63a" containerName="oc" Mar 20 11:39:11 crc kubenswrapper[4695]: I0320 11:39:11.389273 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b4k74" Mar 20 11:39:11 crc kubenswrapper[4695]: I0320 11:39:11.412340 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b4k74"] Mar 20 11:39:11 crc kubenswrapper[4695]: I0320 11:39:11.487792 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f6b59cc-b34d-4601-a8ee-446cba21ea2e-utilities\") pod \"redhat-operators-b4k74\" (UID: \"6f6b59cc-b34d-4601-a8ee-446cba21ea2e\") " pod="openshift-marketplace/redhat-operators-b4k74" Mar 20 11:39:11 crc kubenswrapper[4695]: I0320 11:39:11.487851 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjlpp\" (UniqueName: \"kubernetes.io/projected/6f6b59cc-b34d-4601-a8ee-446cba21ea2e-kube-api-access-sjlpp\") pod \"redhat-operators-b4k74\" (UID: \"6f6b59cc-b34d-4601-a8ee-446cba21ea2e\") " pod="openshift-marketplace/redhat-operators-b4k74" Mar 20 11:39:11 crc kubenswrapper[4695]: I0320 11:39:11.487997 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f6b59cc-b34d-4601-a8ee-446cba21ea2e-catalog-content\") pod \"redhat-operators-b4k74\" (UID: \"6f6b59cc-b34d-4601-a8ee-446cba21ea2e\") " pod="openshift-marketplace/redhat-operators-b4k74" Mar 20 11:39:11 crc kubenswrapper[4695]: I0320 11:39:11.589458 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjlpp\" (UniqueName: \"kubernetes.io/projected/6f6b59cc-b34d-4601-a8ee-446cba21ea2e-kube-api-access-sjlpp\") pod 
\"redhat-operators-b4k74\" (UID: \"6f6b59cc-b34d-4601-a8ee-446cba21ea2e\") " pod="openshift-marketplace/redhat-operators-b4k74" Mar 20 11:39:11 crc kubenswrapper[4695]: I0320 11:39:11.590034 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f6b59cc-b34d-4601-a8ee-446cba21ea2e-catalog-content\") pod \"redhat-operators-b4k74\" (UID: \"6f6b59cc-b34d-4601-a8ee-446cba21ea2e\") " pod="openshift-marketplace/redhat-operators-b4k74" Mar 20 11:39:11 crc kubenswrapper[4695]: I0320 11:39:11.590142 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f6b59cc-b34d-4601-a8ee-446cba21ea2e-utilities\") pod \"redhat-operators-b4k74\" (UID: \"6f6b59cc-b34d-4601-a8ee-446cba21ea2e\") " pod="openshift-marketplace/redhat-operators-b4k74" Mar 20 11:39:11 crc kubenswrapper[4695]: I0320 11:39:11.590774 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f6b59cc-b34d-4601-a8ee-446cba21ea2e-utilities\") pod \"redhat-operators-b4k74\" (UID: \"6f6b59cc-b34d-4601-a8ee-446cba21ea2e\") " pod="openshift-marketplace/redhat-operators-b4k74" Mar 20 11:39:11 crc kubenswrapper[4695]: I0320 11:39:11.591035 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f6b59cc-b34d-4601-a8ee-446cba21ea2e-catalog-content\") pod \"redhat-operators-b4k74\" (UID: \"6f6b59cc-b34d-4601-a8ee-446cba21ea2e\") " pod="openshift-marketplace/redhat-operators-b4k74" Mar 20 11:39:11 crc kubenswrapper[4695]: I0320 11:39:11.617561 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjlpp\" (UniqueName: \"kubernetes.io/projected/6f6b59cc-b34d-4601-a8ee-446cba21ea2e-kube-api-access-sjlpp\") pod \"redhat-operators-b4k74\" (UID: \"6f6b59cc-b34d-4601-a8ee-446cba21ea2e\") " 
pod="openshift-marketplace/redhat-operators-b4k74" Mar 20 11:39:11 crc kubenswrapper[4695]: I0320 11:39:11.712955 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b4k74" Mar 20 11:39:12 crc kubenswrapper[4695]: I0320 11:39:12.317520 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b4k74"] Mar 20 11:39:12 crc kubenswrapper[4695]: I0320 11:39:12.999525 4695 generic.go:334] "Generic (PLEG): container finished" podID="6f6b59cc-b34d-4601-a8ee-446cba21ea2e" containerID="ca6866db685995d699fff376eaae12f185b3624fc5c8d5f9178f32c1d2c95d90" exitCode=0 Mar 20 11:39:13 crc kubenswrapper[4695]: I0320 11:39:12.999598 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b4k74" event={"ID":"6f6b59cc-b34d-4601-a8ee-446cba21ea2e","Type":"ContainerDied","Data":"ca6866db685995d699fff376eaae12f185b3624fc5c8d5f9178f32c1d2c95d90"} Mar 20 11:39:13 crc kubenswrapper[4695]: I0320 11:39:12.999653 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b4k74" event={"ID":"6f6b59cc-b34d-4601-a8ee-446cba21ea2e","Type":"ContainerStarted","Data":"285739ea596d1c2577020804967d7f50abc42ca214dfccda9886e18b2aab0985"} Mar 20 11:39:14 crc kubenswrapper[4695]: I0320 11:39:14.022325 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b4k74" event={"ID":"6f6b59cc-b34d-4601-a8ee-446cba21ea2e","Type":"ContainerStarted","Data":"b7149a599cc1a181fce16ee845ec950e9a988a15205a0789873dec62d98dede8"} Mar 20 11:39:16 crc kubenswrapper[4695]: I0320 11:39:16.040125 4695 generic.go:334] "Generic (PLEG): container finished" podID="6f6b59cc-b34d-4601-a8ee-446cba21ea2e" containerID="b7149a599cc1a181fce16ee845ec950e9a988a15205a0789873dec62d98dede8" exitCode=0 Mar 20 11:39:16 crc kubenswrapper[4695]: I0320 11:39:16.040194 4695 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-b4k74" event={"ID":"6f6b59cc-b34d-4601-a8ee-446cba21ea2e","Type":"ContainerDied","Data":"b7149a599cc1a181fce16ee845ec950e9a988a15205a0789873dec62d98dede8"} Mar 20 11:39:17 crc kubenswrapper[4695]: I0320 11:39:17.049481 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b4k74" event={"ID":"6f6b59cc-b34d-4601-a8ee-446cba21ea2e","Type":"ContainerStarted","Data":"9df57c379e97d439ba9bbf3a02c831b0346e6d214ad3de70fe961ce18354a3f7"} Mar 20 11:39:17 crc kubenswrapper[4695]: I0320 11:39:17.075534 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b4k74" podStartSLOduration=2.641731195 podStartE2EDuration="6.075508065s" podCreationTimestamp="2026-03-20 11:39:11 +0000 UTC" firstStartedPulling="2026-03-20 11:39:13.003082844 +0000 UTC m=+2730.783688407" lastFinishedPulling="2026-03-20 11:39:16.436859714 +0000 UTC m=+2734.217465277" observedRunningTime="2026-03-20 11:39:17.068864865 +0000 UTC m=+2734.849470448" watchObservedRunningTime="2026-03-20 11:39:17.075508065 +0000 UTC m=+2734.856113628" Mar 20 11:39:21 crc kubenswrapper[4695]: I0320 11:39:21.713803 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b4k74" Mar 20 11:39:21 crc kubenswrapper[4695]: I0320 11:39:21.714271 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b4k74" Mar 20 11:39:22 crc kubenswrapper[4695]: I0320 11:39:22.779653 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b4k74" podUID="6f6b59cc-b34d-4601-a8ee-446cba21ea2e" containerName="registry-server" probeResult="failure" output=< Mar 20 11:39:22 crc kubenswrapper[4695]: timeout: failed to connect service ":50051" within 1s Mar 20 11:39:22 crc kubenswrapper[4695]: > Mar 20 11:39:29 crc kubenswrapper[4695]: I0320 
11:39:29.806898 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6fzbs"] Mar 20 11:39:29 crc kubenswrapper[4695]: I0320 11:39:29.812215 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6fzbs" Mar 20 11:39:29 crc kubenswrapper[4695]: I0320 11:39:29.826981 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6fzbs"] Mar 20 11:39:30 crc kubenswrapper[4695]: I0320 11:39:30.008024 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22f6f48c-10fc-4f0b-9326-e9a243983e97-catalog-content\") pod \"certified-operators-6fzbs\" (UID: \"22f6f48c-10fc-4f0b-9326-e9a243983e97\") " pod="openshift-marketplace/certified-operators-6fzbs" Mar 20 11:39:30 crc kubenswrapper[4695]: I0320 11:39:30.008128 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22f6f48c-10fc-4f0b-9326-e9a243983e97-utilities\") pod \"certified-operators-6fzbs\" (UID: \"22f6f48c-10fc-4f0b-9326-e9a243983e97\") " pod="openshift-marketplace/certified-operators-6fzbs" Mar 20 11:39:30 crc kubenswrapper[4695]: I0320 11:39:30.008174 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m5pd\" (UniqueName: \"kubernetes.io/projected/22f6f48c-10fc-4f0b-9326-e9a243983e97-kube-api-access-7m5pd\") pod \"certified-operators-6fzbs\" (UID: \"22f6f48c-10fc-4f0b-9326-e9a243983e97\") " pod="openshift-marketplace/certified-operators-6fzbs" Mar 20 11:39:30 crc kubenswrapper[4695]: I0320 11:39:30.109305 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22f6f48c-10fc-4f0b-9326-e9a243983e97-utilities\") pod 
\"certified-operators-6fzbs\" (UID: \"22f6f48c-10fc-4f0b-9326-e9a243983e97\") " pod="openshift-marketplace/certified-operators-6fzbs" Mar 20 11:39:30 crc kubenswrapper[4695]: I0320 11:39:30.109399 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m5pd\" (UniqueName: \"kubernetes.io/projected/22f6f48c-10fc-4f0b-9326-e9a243983e97-kube-api-access-7m5pd\") pod \"certified-operators-6fzbs\" (UID: \"22f6f48c-10fc-4f0b-9326-e9a243983e97\") " pod="openshift-marketplace/certified-operators-6fzbs" Mar 20 11:39:30 crc kubenswrapper[4695]: I0320 11:39:30.109458 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22f6f48c-10fc-4f0b-9326-e9a243983e97-catalog-content\") pod \"certified-operators-6fzbs\" (UID: \"22f6f48c-10fc-4f0b-9326-e9a243983e97\") " pod="openshift-marketplace/certified-operators-6fzbs" Mar 20 11:39:30 crc kubenswrapper[4695]: I0320 11:39:30.109833 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22f6f48c-10fc-4f0b-9326-e9a243983e97-utilities\") pod \"certified-operators-6fzbs\" (UID: \"22f6f48c-10fc-4f0b-9326-e9a243983e97\") " pod="openshift-marketplace/certified-operators-6fzbs" Mar 20 11:39:30 crc kubenswrapper[4695]: I0320 11:39:30.110161 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22f6f48c-10fc-4f0b-9326-e9a243983e97-catalog-content\") pod \"certified-operators-6fzbs\" (UID: \"22f6f48c-10fc-4f0b-9326-e9a243983e97\") " pod="openshift-marketplace/certified-operators-6fzbs" Mar 20 11:39:30 crc kubenswrapper[4695]: I0320 11:39:30.133274 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m5pd\" (UniqueName: \"kubernetes.io/projected/22f6f48c-10fc-4f0b-9326-e9a243983e97-kube-api-access-7m5pd\") pod 
\"certified-operators-6fzbs\" (UID: \"22f6f48c-10fc-4f0b-9326-e9a243983e97\") " pod="openshift-marketplace/certified-operators-6fzbs" Mar 20 11:39:30 crc kubenswrapper[4695]: I0320 11:39:30.142604 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6fzbs" Mar 20 11:39:30 crc kubenswrapper[4695]: I0320 11:39:30.788773 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6fzbs"] Mar 20 11:39:31 crc kubenswrapper[4695]: I0320 11:39:31.149285 4695 generic.go:334] "Generic (PLEG): container finished" podID="22f6f48c-10fc-4f0b-9326-e9a243983e97" containerID="1d973fa2b29ef0a813bb1294cd477476ac711eb1487f4d9402ddf4e02c12903d" exitCode=0 Mar 20 11:39:31 crc kubenswrapper[4695]: I0320 11:39:31.149384 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fzbs" event={"ID":"22f6f48c-10fc-4f0b-9326-e9a243983e97","Type":"ContainerDied","Data":"1d973fa2b29ef0a813bb1294cd477476ac711eb1487f4d9402ddf4e02c12903d"} Mar 20 11:39:31 crc kubenswrapper[4695]: I0320 11:39:31.149649 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fzbs" event={"ID":"22f6f48c-10fc-4f0b-9326-e9a243983e97","Type":"ContainerStarted","Data":"c29e572d35aae2f220ef1f7ad10a9b4077503739b6b4ac20c72f973979b6e900"} Mar 20 11:39:31 crc kubenswrapper[4695]: I0320 11:39:31.834238 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b4k74" Mar 20 11:39:31 crc kubenswrapper[4695]: I0320 11:39:31.885478 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b4k74" Mar 20 11:39:33 crc kubenswrapper[4695]: I0320 11:39:33.178820 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fzbs" 
event={"ID":"22f6f48c-10fc-4f0b-9326-e9a243983e97","Type":"ContainerStarted","Data":"01b040bea1be4f3a0405f25fd6d46f49419c936b9814f56b0a4f1caaba023cdf"} Mar 20 11:39:33 crc kubenswrapper[4695]: E0320 11:39:33.818371 4695 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22f6f48c_10fc_4f0b_9326_e9a243983e97.slice/crio-conmon-01b040bea1be4f3a0405f25fd6d46f49419c936b9814f56b0a4f1caaba023cdf.scope\": RecentStats: unable to find data in memory cache]" Mar 20 11:39:34 crc kubenswrapper[4695]: I0320 11:39:34.183926 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b4k74"] Mar 20 11:39:34 crc kubenswrapper[4695]: I0320 11:39:34.184369 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b4k74" podUID="6f6b59cc-b34d-4601-a8ee-446cba21ea2e" containerName="registry-server" containerID="cri-o://9df57c379e97d439ba9bbf3a02c831b0346e6d214ad3de70fe961ce18354a3f7" gracePeriod=2 Mar 20 11:39:34 crc kubenswrapper[4695]: I0320 11:39:34.193395 4695 generic.go:334] "Generic (PLEG): container finished" podID="22f6f48c-10fc-4f0b-9326-e9a243983e97" containerID="01b040bea1be4f3a0405f25fd6d46f49419c936b9814f56b0a4f1caaba023cdf" exitCode=0 Mar 20 11:39:34 crc kubenswrapper[4695]: I0320 11:39:34.193431 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fzbs" event={"ID":"22f6f48c-10fc-4f0b-9326-e9a243983e97","Type":"ContainerDied","Data":"01b040bea1be4f3a0405f25fd6d46f49419c936b9814f56b0a4f1caaba023cdf"} Mar 20 11:39:34 crc kubenswrapper[4695]: I0320 11:39:34.574162 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b4k74" Mar 20 11:39:34 crc kubenswrapper[4695]: I0320 11:39:34.622669 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f6b59cc-b34d-4601-a8ee-446cba21ea2e-utilities\") pod \"6f6b59cc-b34d-4601-a8ee-446cba21ea2e\" (UID: \"6f6b59cc-b34d-4601-a8ee-446cba21ea2e\") " Mar 20 11:39:34 crc kubenswrapper[4695]: I0320 11:39:34.622959 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f6b59cc-b34d-4601-a8ee-446cba21ea2e-catalog-content\") pod \"6f6b59cc-b34d-4601-a8ee-446cba21ea2e\" (UID: \"6f6b59cc-b34d-4601-a8ee-446cba21ea2e\") " Mar 20 11:39:34 crc kubenswrapper[4695]: I0320 11:39:34.623011 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjlpp\" (UniqueName: \"kubernetes.io/projected/6f6b59cc-b34d-4601-a8ee-446cba21ea2e-kube-api-access-sjlpp\") pod \"6f6b59cc-b34d-4601-a8ee-446cba21ea2e\" (UID: \"6f6b59cc-b34d-4601-a8ee-446cba21ea2e\") " Mar 20 11:39:34 crc kubenswrapper[4695]: I0320 11:39:34.624038 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f6b59cc-b34d-4601-a8ee-446cba21ea2e-utilities" (OuterVolumeSpecName: "utilities") pod "6f6b59cc-b34d-4601-a8ee-446cba21ea2e" (UID: "6f6b59cc-b34d-4601-a8ee-446cba21ea2e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:39:34 crc kubenswrapper[4695]: I0320 11:39:34.629264 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f6b59cc-b34d-4601-a8ee-446cba21ea2e-kube-api-access-sjlpp" (OuterVolumeSpecName: "kube-api-access-sjlpp") pod "6f6b59cc-b34d-4601-a8ee-446cba21ea2e" (UID: "6f6b59cc-b34d-4601-a8ee-446cba21ea2e"). InnerVolumeSpecName "kube-api-access-sjlpp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:39:34 crc kubenswrapper[4695]: I0320 11:39:34.726509 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjlpp\" (UniqueName: \"kubernetes.io/projected/6f6b59cc-b34d-4601-a8ee-446cba21ea2e-kube-api-access-sjlpp\") on node \"crc\" DevicePath \"\"" Mar 20 11:39:34 crc kubenswrapper[4695]: I0320 11:39:34.726552 4695 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f6b59cc-b34d-4601-a8ee-446cba21ea2e-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:39:34 crc kubenswrapper[4695]: I0320 11:39:34.759771 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f6b59cc-b34d-4601-a8ee-446cba21ea2e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f6b59cc-b34d-4601-a8ee-446cba21ea2e" (UID: "6f6b59cc-b34d-4601-a8ee-446cba21ea2e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:39:34 crc kubenswrapper[4695]: I0320 11:39:34.828375 4695 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f6b59cc-b34d-4601-a8ee-446cba21ea2e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:39:35 crc kubenswrapper[4695]: I0320 11:39:35.204181 4695 generic.go:334] "Generic (PLEG): container finished" podID="6f6b59cc-b34d-4601-a8ee-446cba21ea2e" containerID="9df57c379e97d439ba9bbf3a02c831b0346e6d214ad3de70fe961ce18354a3f7" exitCode=0 Mar 20 11:39:35 crc kubenswrapper[4695]: I0320 11:39:35.204291 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b4k74" event={"ID":"6f6b59cc-b34d-4601-a8ee-446cba21ea2e","Type":"ContainerDied","Data":"9df57c379e97d439ba9bbf3a02c831b0346e6d214ad3de70fe961ce18354a3f7"} Mar 20 11:39:35 crc kubenswrapper[4695]: I0320 11:39:35.204725 4695 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-b4k74" event={"ID":"6f6b59cc-b34d-4601-a8ee-446cba21ea2e","Type":"ContainerDied","Data":"285739ea596d1c2577020804967d7f50abc42ca214dfccda9886e18b2aab0985"} Mar 20 11:39:35 crc kubenswrapper[4695]: I0320 11:39:35.204787 4695 scope.go:117] "RemoveContainer" containerID="9df57c379e97d439ba9bbf3a02c831b0346e6d214ad3de70fe961ce18354a3f7" Mar 20 11:39:35 crc kubenswrapper[4695]: I0320 11:39:35.204330 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b4k74" Mar 20 11:39:35 crc kubenswrapper[4695]: I0320 11:39:35.212430 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fzbs" event={"ID":"22f6f48c-10fc-4f0b-9326-e9a243983e97","Type":"ContainerStarted","Data":"a1d910d8f12d76460b8baf43f57b04753b282971d0e362698e5ddb4a5732c99b"} Mar 20 11:39:35 crc kubenswrapper[4695]: I0320 11:39:35.228660 4695 scope.go:117] "RemoveContainer" containerID="b7149a599cc1a181fce16ee845ec950e9a988a15205a0789873dec62d98dede8" Mar 20 11:39:35 crc kubenswrapper[4695]: I0320 11:39:35.232179 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b4k74"] Mar 20 11:39:35 crc kubenswrapper[4695]: I0320 11:39:35.238370 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b4k74"] Mar 20 11:39:35 crc kubenswrapper[4695]: I0320 11:39:35.247868 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6fzbs" podStartSLOduration=2.839308473 podStartE2EDuration="6.24784371s" podCreationTimestamp="2026-03-20 11:39:29 +0000 UTC" firstStartedPulling="2026-03-20 11:39:31.151427473 +0000 UTC m=+2748.932033036" lastFinishedPulling="2026-03-20 11:39:34.55996271 +0000 UTC m=+2752.340568273" observedRunningTime="2026-03-20 11:39:35.247599523 +0000 UTC m=+2753.028205086" 
watchObservedRunningTime="2026-03-20 11:39:35.24784371 +0000 UTC m=+2753.028449273" Mar 20 11:39:35 crc kubenswrapper[4695]: I0320 11:39:35.257288 4695 scope.go:117] "RemoveContainer" containerID="ca6866db685995d699fff376eaae12f185b3624fc5c8d5f9178f32c1d2c95d90" Mar 20 11:39:35 crc kubenswrapper[4695]: I0320 11:39:35.273295 4695 scope.go:117] "RemoveContainer" containerID="9df57c379e97d439ba9bbf3a02c831b0346e6d214ad3de70fe961ce18354a3f7" Mar 20 11:39:35 crc kubenswrapper[4695]: E0320 11:39:35.273722 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9df57c379e97d439ba9bbf3a02c831b0346e6d214ad3de70fe961ce18354a3f7\": container with ID starting with 9df57c379e97d439ba9bbf3a02c831b0346e6d214ad3de70fe961ce18354a3f7 not found: ID does not exist" containerID="9df57c379e97d439ba9bbf3a02c831b0346e6d214ad3de70fe961ce18354a3f7" Mar 20 11:39:35 crc kubenswrapper[4695]: I0320 11:39:35.273759 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9df57c379e97d439ba9bbf3a02c831b0346e6d214ad3de70fe961ce18354a3f7"} err="failed to get container status \"9df57c379e97d439ba9bbf3a02c831b0346e6d214ad3de70fe961ce18354a3f7\": rpc error: code = NotFound desc = could not find container \"9df57c379e97d439ba9bbf3a02c831b0346e6d214ad3de70fe961ce18354a3f7\": container with ID starting with 9df57c379e97d439ba9bbf3a02c831b0346e6d214ad3de70fe961ce18354a3f7 not found: ID does not exist" Mar 20 11:39:35 crc kubenswrapper[4695]: I0320 11:39:35.273782 4695 scope.go:117] "RemoveContainer" containerID="b7149a599cc1a181fce16ee845ec950e9a988a15205a0789873dec62d98dede8" Mar 20 11:39:35 crc kubenswrapper[4695]: E0320 11:39:35.274224 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7149a599cc1a181fce16ee845ec950e9a988a15205a0789873dec62d98dede8\": container with ID starting with 
b7149a599cc1a181fce16ee845ec950e9a988a15205a0789873dec62d98dede8 not found: ID does not exist" containerID="b7149a599cc1a181fce16ee845ec950e9a988a15205a0789873dec62d98dede8" Mar 20 11:39:35 crc kubenswrapper[4695]: I0320 11:39:35.274312 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7149a599cc1a181fce16ee845ec950e9a988a15205a0789873dec62d98dede8"} err="failed to get container status \"b7149a599cc1a181fce16ee845ec950e9a988a15205a0789873dec62d98dede8\": rpc error: code = NotFound desc = could not find container \"b7149a599cc1a181fce16ee845ec950e9a988a15205a0789873dec62d98dede8\": container with ID starting with b7149a599cc1a181fce16ee845ec950e9a988a15205a0789873dec62d98dede8 not found: ID does not exist" Mar 20 11:39:35 crc kubenswrapper[4695]: I0320 11:39:35.274364 4695 scope.go:117] "RemoveContainer" containerID="ca6866db685995d699fff376eaae12f185b3624fc5c8d5f9178f32c1d2c95d90" Mar 20 11:39:35 crc kubenswrapper[4695]: E0320 11:39:35.274784 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca6866db685995d699fff376eaae12f185b3624fc5c8d5f9178f32c1d2c95d90\": container with ID starting with ca6866db685995d699fff376eaae12f185b3624fc5c8d5f9178f32c1d2c95d90 not found: ID does not exist" containerID="ca6866db685995d699fff376eaae12f185b3624fc5c8d5f9178f32c1d2c95d90" Mar 20 11:39:35 crc kubenswrapper[4695]: I0320 11:39:35.274813 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca6866db685995d699fff376eaae12f185b3624fc5c8d5f9178f32c1d2c95d90"} err="failed to get container status \"ca6866db685995d699fff376eaae12f185b3624fc5c8d5f9178f32c1d2c95d90\": rpc error: code = NotFound desc = could not find container \"ca6866db685995d699fff376eaae12f185b3624fc5c8d5f9178f32c1d2c95d90\": container with ID starting with ca6866db685995d699fff376eaae12f185b3624fc5c8d5f9178f32c1d2c95d90 not found: ID does not 
exist" Mar 20 11:39:36 crc kubenswrapper[4695]: I0320 11:39:36.901723 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f6b59cc-b34d-4601-a8ee-446cba21ea2e" path="/var/lib/kubelet/pods/6f6b59cc-b34d-4601-a8ee-446cba21ea2e/volumes" Mar 20 11:39:38 crc kubenswrapper[4695]: I0320 11:39:38.431559 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:39:38 crc kubenswrapper[4695]: I0320 11:39:38.432001 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:39:40 crc kubenswrapper[4695]: I0320 11:39:40.143627 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6fzbs" Mar 20 11:39:40 crc kubenswrapper[4695]: I0320 11:39:40.143696 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6fzbs" Mar 20 11:39:40 crc kubenswrapper[4695]: I0320 11:39:40.184240 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6fzbs" Mar 20 11:39:40 crc kubenswrapper[4695]: I0320 11:39:40.282078 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6fzbs" Mar 20 11:39:40 crc kubenswrapper[4695]: I0320 11:39:40.422712 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6fzbs"] Mar 20 11:39:42 crc kubenswrapper[4695]: I0320 
11:39:42.258867 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6fzbs" podUID="22f6f48c-10fc-4f0b-9326-e9a243983e97" containerName="registry-server" containerID="cri-o://a1d910d8f12d76460b8baf43f57b04753b282971d0e362698e5ddb4a5732c99b" gracePeriod=2 Mar 20 11:39:42 crc kubenswrapper[4695]: I0320 11:39:42.762671 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6fzbs" Mar 20 11:39:42 crc kubenswrapper[4695]: I0320 11:39:42.843992 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m5pd\" (UniqueName: \"kubernetes.io/projected/22f6f48c-10fc-4f0b-9326-e9a243983e97-kube-api-access-7m5pd\") pod \"22f6f48c-10fc-4f0b-9326-e9a243983e97\" (UID: \"22f6f48c-10fc-4f0b-9326-e9a243983e97\") " Mar 20 11:39:42 crc kubenswrapper[4695]: I0320 11:39:42.844088 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22f6f48c-10fc-4f0b-9326-e9a243983e97-utilities\") pod \"22f6f48c-10fc-4f0b-9326-e9a243983e97\" (UID: \"22f6f48c-10fc-4f0b-9326-e9a243983e97\") " Mar 20 11:39:42 crc kubenswrapper[4695]: I0320 11:39:42.844116 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22f6f48c-10fc-4f0b-9326-e9a243983e97-catalog-content\") pod \"22f6f48c-10fc-4f0b-9326-e9a243983e97\" (UID: \"22f6f48c-10fc-4f0b-9326-e9a243983e97\") " Mar 20 11:39:42 crc kubenswrapper[4695]: I0320 11:39:42.845140 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22f6f48c-10fc-4f0b-9326-e9a243983e97-utilities" (OuterVolumeSpecName: "utilities") pod "22f6f48c-10fc-4f0b-9326-e9a243983e97" (UID: "22f6f48c-10fc-4f0b-9326-e9a243983e97"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:39:42 crc kubenswrapper[4695]: I0320 11:39:42.850061 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22f6f48c-10fc-4f0b-9326-e9a243983e97-kube-api-access-7m5pd" (OuterVolumeSpecName: "kube-api-access-7m5pd") pod "22f6f48c-10fc-4f0b-9326-e9a243983e97" (UID: "22f6f48c-10fc-4f0b-9326-e9a243983e97"). InnerVolumeSpecName "kube-api-access-7m5pd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:39:42 crc kubenswrapper[4695]: I0320 11:39:42.908856 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22f6f48c-10fc-4f0b-9326-e9a243983e97-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22f6f48c-10fc-4f0b-9326-e9a243983e97" (UID: "22f6f48c-10fc-4f0b-9326-e9a243983e97"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:39:42 crc kubenswrapper[4695]: I0320 11:39:42.946303 4695 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22f6f48c-10fc-4f0b-9326-e9a243983e97-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:39:42 crc kubenswrapper[4695]: I0320 11:39:42.946342 4695 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22f6f48c-10fc-4f0b-9326-e9a243983e97-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:39:42 crc kubenswrapper[4695]: I0320 11:39:42.946359 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m5pd\" (UniqueName: \"kubernetes.io/projected/22f6f48c-10fc-4f0b-9326-e9a243983e97-kube-api-access-7m5pd\") on node \"crc\" DevicePath \"\"" Mar 20 11:39:43 crc kubenswrapper[4695]: I0320 11:39:43.268301 4695 generic.go:334] "Generic (PLEG): container finished" podID="22f6f48c-10fc-4f0b-9326-e9a243983e97" 
containerID="a1d910d8f12d76460b8baf43f57b04753b282971d0e362698e5ddb4a5732c99b" exitCode=0 Mar 20 11:39:43 crc kubenswrapper[4695]: I0320 11:39:43.268354 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fzbs" event={"ID":"22f6f48c-10fc-4f0b-9326-e9a243983e97","Type":"ContainerDied","Data":"a1d910d8f12d76460b8baf43f57b04753b282971d0e362698e5ddb4a5732c99b"} Mar 20 11:39:43 crc kubenswrapper[4695]: I0320 11:39:43.268361 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6fzbs" Mar 20 11:39:43 crc kubenswrapper[4695]: I0320 11:39:43.268395 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6fzbs" event={"ID":"22f6f48c-10fc-4f0b-9326-e9a243983e97","Type":"ContainerDied","Data":"c29e572d35aae2f220ef1f7ad10a9b4077503739b6b4ac20c72f973979b6e900"} Mar 20 11:39:43 crc kubenswrapper[4695]: I0320 11:39:43.268421 4695 scope.go:117] "RemoveContainer" containerID="a1d910d8f12d76460b8baf43f57b04753b282971d0e362698e5ddb4a5732c99b" Mar 20 11:39:43 crc kubenswrapper[4695]: I0320 11:39:43.305323 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6fzbs"] Mar 20 11:39:43 crc kubenswrapper[4695]: I0320 11:39:43.305981 4695 scope.go:117] "RemoveContainer" containerID="01b040bea1be4f3a0405f25fd6d46f49419c936b9814f56b0a4f1caaba023cdf" Mar 20 11:39:43 crc kubenswrapper[4695]: I0320 11:39:43.311823 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6fzbs"] Mar 20 11:39:43 crc kubenswrapper[4695]: I0320 11:39:43.328105 4695 scope.go:117] "RemoveContainer" containerID="1d973fa2b29ef0a813bb1294cd477476ac711eb1487f4d9402ddf4e02c12903d" Mar 20 11:39:43 crc kubenswrapper[4695]: I0320 11:39:43.348935 4695 scope.go:117] "RemoveContainer" containerID="a1d910d8f12d76460b8baf43f57b04753b282971d0e362698e5ddb4a5732c99b" Mar 20 
11:39:43 crc kubenswrapper[4695]: E0320 11:39:43.349325 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1d910d8f12d76460b8baf43f57b04753b282971d0e362698e5ddb4a5732c99b\": container with ID starting with a1d910d8f12d76460b8baf43f57b04753b282971d0e362698e5ddb4a5732c99b not found: ID does not exist" containerID="a1d910d8f12d76460b8baf43f57b04753b282971d0e362698e5ddb4a5732c99b" Mar 20 11:39:43 crc kubenswrapper[4695]: I0320 11:39:43.349372 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1d910d8f12d76460b8baf43f57b04753b282971d0e362698e5ddb4a5732c99b"} err="failed to get container status \"a1d910d8f12d76460b8baf43f57b04753b282971d0e362698e5ddb4a5732c99b\": rpc error: code = NotFound desc = could not find container \"a1d910d8f12d76460b8baf43f57b04753b282971d0e362698e5ddb4a5732c99b\": container with ID starting with a1d910d8f12d76460b8baf43f57b04753b282971d0e362698e5ddb4a5732c99b not found: ID does not exist" Mar 20 11:39:43 crc kubenswrapper[4695]: I0320 11:39:43.349402 4695 scope.go:117] "RemoveContainer" containerID="01b040bea1be4f3a0405f25fd6d46f49419c936b9814f56b0a4f1caaba023cdf" Mar 20 11:39:43 crc kubenswrapper[4695]: E0320 11:39:43.349760 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01b040bea1be4f3a0405f25fd6d46f49419c936b9814f56b0a4f1caaba023cdf\": container with ID starting with 01b040bea1be4f3a0405f25fd6d46f49419c936b9814f56b0a4f1caaba023cdf not found: ID does not exist" containerID="01b040bea1be4f3a0405f25fd6d46f49419c936b9814f56b0a4f1caaba023cdf" Mar 20 11:39:43 crc kubenswrapper[4695]: I0320 11:39:43.349787 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01b040bea1be4f3a0405f25fd6d46f49419c936b9814f56b0a4f1caaba023cdf"} err="failed to get container status 
\"01b040bea1be4f3a0405f25fd6d46f49419c936b9814f56b0a4f1caaba023cdf\": rpc error: code = NotFound desc = could not find container \"01b040bea1be4f3a0405f25fd6d46f49419c936b9814f56b0a4f1caaba023cdf\": container with ID starting with 01b040bea1be4f3a0405f25fd6d46f49419c936b9814f56b0a4f1caaba023cdf not found: ID does not exist" Mar 20 11:39:43 crc kubenswrapper[4695]: I0320 11:39:43.349807 4695 scope.go:117] "RemoveContainer" containerID="1d973fa2b29ef0a813bb1294cd477476ac711eb1487f4d9402ddf4e02c12903d" Mar 20 11:39:43 crc kubenswrapper[4695]: E0320 11:39:43.350398 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d973fa2b29ef0a813bb1294cd477476ac711eb1487f4d9402ddf4e02c12903d\": container with ID starting with 1d973fa2b29ef0a813bb1294cd477476ac711eb1487f4d9402ddf4e02c12903d not found: ID does not exist" containerID="1d973fa2b29ef0a813bb1294cd477476ac711eb1487f4d9402ddf4e02c12903d" Mar 20 11:39:43 crc kubenswrapper[4695]: I0320 11:39:43.350446 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d973fa2b29ef0a813bb1294cd477476ac711eb1487f4d9402ddf4e02c12903d"} err="failed to get container status \"1d973fa2b29ef0a813bb1294cd477476ac711eb1487f4d9402ddf4e02c12903d\": rpc error: code = NotFound desc = could not find container \"1d973fa2b29ef0a813bb1294cd477476ac711eb1487f4d9402ddf4e02c12903d\": container with ID starting with 1d973fa2b29ef0a813bb1294cd477476ac711eb1487f4d9402ddf4e02c12903d not found: ID does not exist" Mar 20 11:39:44 crc kubenswrapper[4695]: I0320 11:39:44.895586 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22f6f48c-10fc-4f0b-9326-e9a243983e97" path="/var/lib/kubelet/pods/22f6f48c-10fc-4f0b-9326-e9a243983e97/volumes" Mar 20 11:40:00 crc kubenswrapper[4695]: I0320 11:40:00.148613 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566780-p8xjb"] Mar 20 11:40:00 
crc kubenswrapper[4695]: E0320 11:40:00.149852 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22f6f48c-10fc-4f0b-9326-e9a243983e97" containerName="registry-server" Mar 20 11:40:00 crc kubenswrapper[4695]: I0320 11:40:00.149876 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="22f6f48c-10fc-4f0b-9326-e9a243983e97" containerName="registry-server" Mar 20 11:40:00 crc kubenswrapper[4695]: E0320 11:40:00.149903 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f6b59cc-b34d-4601-a8ee-446cba21ea2e" containerName="extract-content" Mar 20 11:40:00 crc kubenswrapper[4695]: I0320 11:40:00.149940 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f6b59cc-b34d-4601-a8ee-446cba21ea2e" containerName="extract-content" Mar 20 11:40:00 crc kubenswrapper[4695]: E0320 11:40:00.149966 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f6b59cc-b34d-4601-a8ee-446cba21ea2e" containerName="registry-server" Mar 20 11:40:00 crc kubenswrapper[4695]: I0320 11:40:00.149980 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f6b59cc-b34d-4601-a8ee-446cba21ea2e" containerName="registry-server" Mar 20 11:40:00 crc kubenswrapper[4695]: E0320 11:40:00.149999 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22f6f48c-10fc-4f0b-9326-e9a243983e97" containerName="extract-content" Mar 20 11:40:00 crc kubenswrapper[4695]: I0320 11:40:00.150010 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="22f6f48c-10fc-4f0b-9326-e9a243983e97" containerName="extract-content" Mar 20 11:40:00 crc kubenswrapper[4695]: E0320 11:40:00.150032 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f6b59cc-b34d-4601-a8ee-446cba21ea2e" containerName="extract-utilities" Mar 20 11:40:00 crc kubenswrapper[4695]: I0320 11:40:00.150042 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f6b59cc-b34d-4601-a8ee-446cba21ea2e" containerName="extract-utilities" Mar 20 11:40:00 crc 
kubenswrapper[4695]: E0320 11:40:00.150060 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22f6f48c-10fc-4f0b-9326-e9a243983e97" containerName="extract-utilities" Mar 20 11:40:00 crc kubenswrapper[4695]: I0320 11:40:00.150071 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="22f6f48c-10fc-4f0b-9326-e9a243983e97" containerName="extract-utilities" Mar 20 11:40:00 crc kubenswrapper[4695]: I0320 11:40:00.150352 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f6b59cc-b34d-4601-a8ee-446cba21ea2e" containerName="registry-server" Mar 20 11:40:00 crc kubenswrapper[4695]: I0320 11:40:00.150370 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="22f6f48c-10fc-4f0b-9326-e9a243983e97" containerName="registry-server" Mar 20 11:40:00 crc kubenswrapper[4695]: I0320 11:40:00.151093 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566780-p8xjb" Mar 20 11:40:00 crc kubenswrapper[4695]: I0320 11:40:00.153039 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5kqds" Mar 20 11:40:00 crc kubenswrapper[4695]: I0320 11:40:00.153441 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:40:00 crc kubenswrapper[4695]: I0320 11:40:00.154827 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:40:00 crc kubenswrapper[4695]: I0320 11:40:00.157309 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566780-p8xjb"] Mar 20 11:40:00 crc kubenswrapper[4695]: I0320 11:40:00.215565 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49vbb\" (UniqueName: \"kubernetes.io/projected/9c3f1296-6ed3-4ade-bfe0-6e9c880b9f1a-kube-api-access-49vbb\") pod 
\"auto-csr-approver-29566780-p8xjb\" (UID: \"9c3f1296-6ed3-4ade-bfe0-6e9c880b9f1a\") " pod="openshift-infra/auto-csr-approver-29566780-p8xjb" Mar 20 11:40:00 crc kubenswrapper[4695]: I0320 11:40:00.316978 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49vbb\" (UniqueName: \"kubernetes.io/projected/9c3f1296-6ed3-4ade-bfe0-6e9c880b9f1a-kube-api-access-49vbb\") pod \"auto-csr-approver-29566780-p8xjb\" (UID: \"9c3f1296-6ed3-4ade-bfe0-6e9c880b9f1a\") " pod="openshift-infra/auto-csr-approver-29566780-p8xjb" Mar 20 11:40:00 crc kubenswrapper[4695]: I0320 11:40:00.337681 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49vbb\" (UniqueName: \"kubernetes.io/projected/9c3f1296-6ed3-4ade-bfe0-6e9c880b9f1a-kube-api-access-49vbb\") pod \"auto-csr-approver-29566780-p8xjb\" (UID: \"9c3f1296-6ed3-4ade-bfe0-6e9c880b9f1a\") " pod="openshift-infra/auto-csr-approver-29566780-p8xjb" Mar 20 11:40:00 crc kubenswrapper[4695]: I0320 11:40:00.476698 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566780-p8xjb" Mar 20 11:40:00 crc kubenswrapper[4695]: I0320 11:40:00.898059 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566780-p8xjb"] Mar 20 11:40:01 crc kubenswrapper[4695]: I0320 11:40:01.400463 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566780-p8xjb" event={"ID":"9c3f1296-6ed3-4ade-bfe0-6e9c880b9f1a","Type":"ContainerStarted","Data":"68296450e6aaa81da9e6479b3ec48ff1711b3a2eb007802611204ed533c5245f"} Mar 20 11:40:02 crc kubenswrapper[4695]: I0320 11:40:02.420618 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566780-p8xjb" event={"ID":"9c3f1296-6ed3-4ade-bfe0-6e9c880b9f1a","Type":"ContainerStarted","Data":"3331d7637395abfd19c3dbdfcadee04e194eaac23fbefc5ff9b5688c6575291b"} Mar 20 11:40:02 crc kubenswrapper[4695]: I0320 11:40:02.440721 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566780-p8xjb" podStartSLOduration=1.313406173 podStartE2EDuration="2.440695534s" podCreationTimestamp="2026-03-20 11:40:00 +0000 UTC" firstStartedPulling="2026-03-20 11:40:00.906703759 +0000 UTC m=+2778.687309322" lastFinishedPulling="2026-03-20 11:40:02.03399308 +0000 UTC m=+2779.814598683" observedRunningTime="2026-03-20 11:40:02.437069661 +0000 UTC m=+2780.217675244" watchObservedRunningTime="2026-03-20 11:40:02.440695534 +0000 UTC m=+2780.221301097" Mar 20 11:40:03 crc kubenswrapper[4695]: I0320 11:40:03.428751 4695 generic.go:334] "Generic (PLEG): container finished" podID="9c3f1296-6ed3-4ade-bfe0-6e9c880b9f1a" containerID="3331d7637395abfd19c3dbdfcadee04e194eaac23fbefc5ff9b5688c6575291b" exitCode=0 Mar 20 11:40:03 crc kubenswrapper[4695]: I0320 11:40:03.428802 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566780-p8xjb" 
event={"ID":"9c3f1296-6ed3-4ade-bfe0-6e9c880b9f1a","Type":"ContainerDied","Data":"3331d7637395abfd19c3dbdfcadee04e194eaac23fbefc5ff9b5688c6575291b"} Mar 20 11:40:04 crc kubenswrapper[4695]: I0320 11:40:04.709979 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566780-p8xjb" Mar 20 11:40:04 crc kubenswrapper[4695]: I0320 11:40:04.782739 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49vbb\" (UniqueName: \"kubernetes.io/projected/9c3f1296-6ed3-4ade-bfe0-6e9c880b9f1a-kube-api-access-49vbb\") pod \"9c3f1296-6ed3-4ade-bfe0-6e9c880b9f1a\" (UID: \"9c3f1296-6ed3-4ade-bfe0-6e9c880b9f1a\") " Mar 20 11:40:04 crc kubenswrapper[4695]: I0320 11:40:04.789936 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c3f1296-6ed3-4ade-bfe0-6e9c880b9f1a-kube-api-access-49vbb" (OuterVolumeSpecName: "kube-api-access-49vbb") pod "9c3f1296-6ed3-4ade-bfe0-6e9c880b9f1a" (UID: "9c3f1296-6ed3-4ade-bfe0-6e9c880b9f1a"). InnerVolumeSpecName "kube-api-access-49vbb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:40:04 crc kubenswrapper[4695]: I0320 11:40:04.884315 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49vbb\" (UniqueName: \"kubernetes.io/projected/9c3f1296-6ed3-4ade-bfe0-6e9c880b9f1a-kube-api-access-49vbb\") on node \"crc\" DevicePath \"\"" Mar 20 11:40:05 crc kubenswrapper[4695]: I0320 11:40:05.445328 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566780-p8xjb" event={"ID":"9c3f1296-6ed3-4ade-bfe0-6e9c880b9f1a","Type":"ContainerDied","Data":"68296450e6aaa81da9e6479b3ec48ff1711b3a2eb007802611204ed533c5245f"} Mar 20 11:40:05 crc kubenswrapper[4695]: I0320 11:40:05.445371 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68296450e6aaa81da9e6479b3ec48ff1711b3a2eb007802611204ed533c5245f" Mar 20 11:40:05 crc kubenswrapper[4695]: I0320 11:40:05.445424 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566780-p8xjb" Mar 20 11:40:05 crc kubenswrapper[4695]: I0320 11:40:05.508516 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566774-4xqkj"] Mar 20 11:40:05 crc kubenswrapper[4695]: I0320 11:40:05.515110 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566774-4xqkj"] Mar 20 11:40:06 crc kubenswrapper[4695]: I0320 11:40:06.902392 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10b65570-3332-484b-8878-ad33a5716725" path="/var/lib/kubelet/pods/10b65570-3332-484b-8878-ad33a5716725/volumes" Mar 20 11:40:07 crc kubenswrapper[4695]: I0320 11:40:07.092636 4695 scope.go:117] "RemoveContainer" containerID="0939349a9e09c57aeb3da6ecab4682362e1799e4d306026f942bc427de6c773c" Mar 20 11:40:08 crc kubenswrapper[4695]: I0320 11:40:08.431093 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:40:08 crc kubenswrapper[4695]: I0320 11:40:08.431569 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:40:38 crc kubenswrapper[4695]: I0320 11:40:38.431012 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:40:38 crc kubenswrapper[4695]: I0320 11:40:38.431655 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:40:38 crc kubenswrapper[4695]: I0320 11:40:38.431738 4695 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" Mar 20 11:40:38 crc kubenswrapper[4695]: I0320 11:40:38.432754 4695 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"75ca987e90ee0fcab0d5950d62b41475299b0a6565381ed22317c122106f8d27"} pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" containerMessage="Container machine-config-daemon failed liveness probe, will be 
restarted" Mar 20 11:40:38 crc kubenswrapper[4695]: I0320 11:40:38.432834 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" containerID="cri-o://75ca987e90ee0fcab0d5950d62b41475299b0a6565381ed22317c122106f8d27" gracePeriod=600 Mar 20 11:40:38 crc kubenswrapper[4695]: I0320 11:40:38.677387 4695 generic.go:334] "Generic (PLEG): container finished" podID="7859c924-84d7-4855-901e-c77a02c56e3a" containerID="75ca987e90ee0fcab0d5950d62b41475299b0a6565381ed22317c122106f8d27" exitCode=0 Mar 20 11:40:38 crc kubenswrapper[4695]: I0320 11:40:38.677459 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" event={"ID":"7859c924-84d7-4855-901e-c77a02c56e3a","Type":"ContainerDied","Data":"75ca987e90ee0fcab0d5950d62b41475299b0a6565381ed22317c122106f8d27"} Mar 20 11:40:38 crc kubenswrapper[4695]: I0320 11:40:38.677866 4695 scope.go:117] "RemoveContainer" containerID="36c75bd77d7ea36a7a3a09991a5d5621e2ec2bdcf81258a215fc68c11e3cf0bd" Mar 20 11:40:39 crc kubenswrapper[4695]: I0320 11:40:39.687843 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" event={"ID":"7859c924-84d7-4855-901e-c77a02c56e3a","Type":"ContainerStarted","Data":"9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683"} Mar 20 11:42:00 crc kubenswrapper[4695]: I0320 11:42:00.146509 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566782-jckgt"] Mar 20 11:42:00 crc kubenswrapper[4695]: E0320 11:42:00.147544 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c3f1296-6ed3-4ade-bfe0-6e9c880b9f1a" containerName="oc" Mar 20 11:42:00 crc kubenswrapper[4695]: I0320 11:42:00.147563 4695 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9c3f1296-6ed3-4ade-bfe0-6e9c880b9f1a" containerName="oc" Mar 20 11:42:00 crc kubenswrapper[4695]: I0320 11:42:00.147750 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c3f1296-6ed3-4ade-bfe0-6e9c880b9f1a" containerName="oc" Mar 20 11:42:00 crc kubenswrapper[4695]: I0320 11:42:00.148414 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566782-jckgt" Mar 20 11:42:00 crc kubenswrapper[4695]: I0320 11:42:00.151601 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:42:00 crc kubenswrapper[4695]: I0320 11:42:00.151940 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5kqds" Mar 20 11:42:00 crc kubenswrapper[4695]: I0320 11:42:00.152801 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:42:00 crc kubenswrapper[4695]: I0320 11:42:00.162149 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566782-jckgt"] Mar 20 11:42:00 crc kubenswrapper[4695]: I0320 11:42:00.190275 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sks2x\" (UniqueName: \"kubernetes.io/projected/93b18780-f249-4ce5-928e-4ede3fde4360-kube-api-access-sks2x\") pod \"auto-csr-approver-29566782-jckgt\" (UID: \"93b18780-f249-4ce5-928e-4ede3fde4360\") " pod="openshift-infra/auto-csr-approver-29566782-jckgt" Mar 20 11:42:00 crc kubenswrapper[4695]: I0320 11:42:00.292193 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sks2x\" (UniqueName: \"kubernetes.io/projected/93b18780-f249-4ce5-928e-4ede3fde4360-kube-api-access-sks2x\") pod \"auto-csr-approver-29566782-jckgt\" (UID: \"93b18780-f249-4ce5-928e-4ede3fde4360\") " 
pod="openshift-infra/auto-csr-approver-29566782-jckgt" Mar 20 11:42:00 crc kubenswrapper[4695]: I0320 11:42:00.311871 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sks2x\" (UniqueName: \"kubernetes.io/projected/93b18780-f249-4ce5-928e-4ede3fde4360-kube-api-access-sks2x\") pod \"auto-csr-approver-29566782-jckgt\" (UID: \"93b18780-f249-4ce5-928e-4ede3fde4360\") " pod="openshift-infra/auto-csr-approver-29566782-jckgt" Mar 20 11:42:00 crc kubenswrapper[4695]: I0320 11:42:00.472361 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566782-jckgt" Mar 20 11:42:00 crc kubenswrapper[4695]: I0320 11:42:00.876350 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566782-jckgt"] Mar 20 11:42:01 crc kubenswrapper[4695]: I0320 11:42:01.248595 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566782-jckgt" event={"ID":"93b18780-f249-4ce5-928e-4ede3fde4360","Type":"ContainerStarted","Data":"7eac50da161fda007b49b72edb7d5119b57b3993ec06e581d488c0ad68fe44f8"} Mar 20 11:42:03 crc kubenswrapper[4695]: I0320 11:42:03.267103 4695 generic.go:334] "Generic (PLEG): container finished" podID="93b18780-f249-4ce5-928e-4ede3fde4360" containerID="5ddb32697daba259741bc8d70ef818ef1c6f1db478d1eab954d023d041ff418b" exitCode=0 Mar 20 11:42:03 crc kubenswrapper[4695]: I0320 11:42:03.267193 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566782-jckgt" event={"ID":"93b18780-f249-4ce5-928e-4ede3fde4360","Type":"ContainerDied","Data":"5ddb32697daba259741bc8d70ef818ef1c6f1db478d1eab954d023d041ff418b"} Mar 20 11:42:04 crc kubenswrapper[4695]: I0320 11:42:04.547984 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566782-jckgt" Mar 20 11:42:04 crc kubenswrapper[4695]: I0320 11:42:04.655972 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sks2x\" (UniqueName: \"kubernetes.io/projected/93b18780-f249-4ce5-928e-4ede3fde4360-kube-api-access-sks2x\") pod \"93b18780-f249-4ce5-928e-4ede3fde4360\" (UID: \"93b18780-f249-4ce5-928e-4ede3fde4360\") " Mar 20 11:42:04 crc kubenswrapper[4695]: I0320 11:42:04.662727 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93b18780-f249-4ce5-928e-4ede3fde4360-kube-api-access-sks2x" (OuterVolumeSpecName: "kube-api-access-sks2x") pod "93b18780-f249-4ce5-928e-4ede3fde4360" (UID: "93b18780-f249-4ce5-928e-4ede3fde4360"). InnerVolumeSpecName "kube-api-access-sks2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:42:04 crc kubenswrapper[4695]: I0320 11:42:04.758253 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sks2x\" (UniqueName: \"kubernetes.io/projected/93b18780-f249-4ce5-928e-4ede3fde4360-kube-api-access-sks2x\") on node \"crc\" DevicePath \"\"" Mar 20 11:42:05 crc kubenswrapper[4695]: I0320 11:42:05.283617 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566782-jckgt" event={"ID":"93b18780-f249-4ce5-928e-4ede3fde4360","Type":"ContainerDied","Data":"7eac50da161fda007b49b72edb7d5119b57b3993ec06e581d488c0ad68fe44f8"} Mar 20 11:42:05 crc kubenswrapper[4695]: I0320 11:42:05.283656 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566782-jckgt" Mar 20 11:42:05 crc kubenswrapper[4695]: I0320 11:42:05.283661 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7eac50da161fda007b49b72edb7d5119b57b3993ec06e581d488c0ad68fe44f8" Mar 20 11:42:05 crc kubenswrapper[4695]: I0320 11:42:05.614105 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566776-7mwns"] Mar 20 11:42:05 crc kubenswrapper[4695]: I0320 11:42:05.619669 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566776-7mwns"] Mar 20 11:42:06 crc kubenswrapper[4695]: I0320 11:42:06.896216 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a225d102-4c7c-4cf6-b565-a248cc7583fb" path="/var/lib/kubelet/pods/a225d102-4c7c-4cf6-b565-a248cc7583fb/volumes" Mar 20 11:42:07 crc kubenswrapper[4695]: I0320 11:42:07.199932 4695 scope.go:117] "RemoveContainer" containerID="e50bfd96d3f1499ab27cc93a9ceed268bb610157cac3e1312228fb1ee4341cc9" Mar 20 11:42:38 crc kubenswrapper[4695]: I0320 11:42:38.431228 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:42:38 crc kubenswrapper[4695]: I0320 11:42:38.433038 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:42:53 crc kubenswrapper[4695]: I0320 11:42:53.144266 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-txvwn"] Mar 20 
11:42:53 crc kubenswrapper[4695]: E0320 11:42:53.145220 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93b18780-f249-4ce5-928e-4ede3fde4360" containerName="oc" Mar 20 11:42:53 crc kubenswrapper[4695]: I0320 11:42:53.145233 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="93b18780-f249-4ce5-928e-4ede3fde4360" containerName="oc" Mar 20 11:42:53 crc kubenswrapper[4695]: I0320 11:42:53.145431 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="93b18780-f249-4ce5-928e-4ede3fde4360" containerName="oc" Mar 20 11:42:53 crc kubenswrapper[4695]: I0320 11:42:53.146393 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-txvwn" Mar 20 11:42:53 crc kubenswrapper[4695]: I0320 11:42:53.160115 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctlz7\" (UniqueName: \"kubernetes.io/projected/ff1b4a44-3245-417e-b97d-04983202ba5a-kube-api-access-ctlz7\") pod \"community-operators-txvwn\" (UID: \"ff1b4a44-3245-417e-b97d-04983202ba5a\") " pod="openshift-marketplace/community-operators-txvwn" Mar 20 11:42:53 crc kubenswrapper[4695]: I0320 11:42:53.160227 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff1b4a44-3245-417e-b97d-04983202ba5a-utilities\") pod \"community-operators-txvwn\" (UID: \"ff1b4a44-3245-417e-b97d-04983202ba5a\") " pod="openshift-marketplace/community-operators-txvwn" Mar 20 11:42:53 crc kubenswrapper[4695]: I0320 11:42:53.160284 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff1b4a44-3245-417e-b97d-04983202ba5a-catalog-content\") pod \"community-operators-txvwn\" (UID: \"ff1b4a44-3245-417e-b97d-04983202ba5a\") " pod="openshift-marketplace/community-operators-txvwn" Mar 
20 11:42:53 crc kubenswrapper[4695]: I0320 11:42:53.161420 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-txvwn"] Mar 20 11:42:53 crc kubenswrapper[4695]: I0320 11:42:53.261282 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff1b4a44-3245-417e-b97d-04983202ba5a-utilities\") pod \"community-operators-txvwn\" (UID: \"ff1b4a44-3245-417e-b97d-04983202ba5a\") " pod="openshift-marketplace/community-operators-txvwn" Mar 20 11:42:53 crc kubenswrapper[4695]: I0320 11:42:53.261418 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff1b4a44-3245-417e-b97d-04983202ba5a-catalog-content\") pod \"community-operators-txvwn\" (UID: \"ff1b4a44-3245-417e-b97d-04983202ba5a\") " pod="openshift-marketplace/community-operators-txvwn" Mar 20 11:42:53 crc kubenswrapper[4695]: I0320 11:42:53.261487 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctlz7\" (UniqueName: \"kubernetes.io/projected/ff1b4a44-3245-417e-b97d-04983202ba5a-kube-api-access-ctlz7\") pod \"community-operators-txvwn\" (UID: \"ff1b4a44-3245-417e-b97d-04983202ba5a\") " pod="openshift-marketplace/community-operators-txvwn" Mar 20 11:42:53 crc kubenswrapper[4695]: I0320 11:42:53.262153 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff1b4a44-3245-417e-b97d-04983202ba5a-catalog-content\") pod \"community-operators-txvwn\" (UID: \"ff1b4a44-3245-417e-b97d-04983202ba5a\") " pod="openshift-marketplace/community-operators-txvwn" Mar 20 11:42:53 crc kubenswrapper[4695]: I0320 11:42:53.262184 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff1b4a44-3245-417e-b97d-04983202ba5a-utilities\") pod 
\"community-operators-txvwn\" (UID: \"ff1b4a44-3245-417e-b97d-04983202ba5a\") " pod="openshift-marketplace/community-operators-txvwn" Mar 20 11:42:53 crc kubenswrapper[4695]: I0320 11:42:53.286623 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctlz7\" (UniqueName: \"kubernetes.io/projected/ff1b4a44-3245-417e-b97d-04983202ba5a-kube-api-access-ctlz7\") pod \"community-operators-txvwn\" (UID: \"ff1b4a44-3245-417e-b97d-04983202ba5a\") " pod="openshift-marketplace/community-operators-txvwn" Mar 20 11:42:53 crc kubenswrapper[4695]: I0320 11:42:53.480548 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-txvwn" Mar 20 11:42:54 crc kubenswrapper[4695]: I0320 11:42:54.406800 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-txvwn"] Mar 20 11:42:55 crc kubenswrapper[4695]: I0320 11:42:55.096694 4695 generic.go:334] "Generic (PLEG): container finished" podID="ff1b4a44-3245-417e-b97d-04983202ba5a" containerID="fd7232c4680842f8753a807f29a9b557c92b77e54819732dc6ba914789299ffe" exitCode=0 Mar 20 11:42:55 crc kubenswrapper[4695]: I0320 11:42:55.096757 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txvwn" event={"ID":"ff1b4a44-3245-417e-b97d-04983202ba5a","Type":"ContainerDied","Data":"fd7232c4680842f8753a807f29a9b557c92b77e54819732dc6ba914789299ffe"} Mar 20 11:42:55 crc kubenswrapper[4695]: I0320 11:42:55.096853 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txvwn" event={"ID":"ff1b4a44-3245-417e-b97d-04983202ba5a","Type":"ContainerStarted","Data":"8f8e5972074ab08241e37e927b58d8cbcb4c6763b624d3fd16880e236d8114f1"} Mar 20 11:42:57 crc kubenswrapper[4695]: I0320 11:42:57.114464 4695 generic.go:334] "Generic (PLEG): container finished" podID="ff1b4a44-3245-417e-b97d-04983202ba5a" 
containerID="3650588d8b82e9a4a8f511698f94f1dc5f2f0ddcc11f966a046e57b81f9afb38" exitCode=0 Mar 20 11:42:57 crc kubenswrapper[4695]: I0320 11:42:57.114564 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txvwn" event={"ID":"ff1b4a44-3245-417e-b97d-04983202ba5a","Type":"ContainerDied","Data":"3650588d8b82e9a4a8f511698f94f1dc5f2f0ddcc11f966a046e57b81f9afb38"} Mar 20 11:42:58 crc kubenswrapper[4695]: I0320 11:42:58.126735 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txvwn" event={"ID":"ff1b4a44-3245-417e-b97d-04983202ba5a","Type":"ContainerStarted","Data":"78622887b373e1e4be56f45ea7b187bf5439c090a11a3865836036fc48aab478"} Mar 20 11:42:58 crc kubenswrapper[4695]: I0320 11:42:58.156750 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-txvwn" podStartSLOduration=2.425547081 podStartE2EDuration="5.156715697s" podCreationTimestamp="2026-03-20 11:42:53 +0000 UTC" firstStartedPulling="2026-03-20 11:42:55.099054097 +0000 UTC m=+2952.879659650" lastFinishedPulling="2026-03-20 11:42:57.830222693 +0000 UTC m=+2955.610828266" observedRunningTime="2026-03-20 11:42:58.151762581 +0000 UTC m=+2955.932368164" watchObservedRunningTime="2026-03-20 11:42:58.156715697 +0000 UTC m=+2955.937321270" Mar 20 11:43:03 crc kubenswrapper[4695]: I0320 11:43:03.480832 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-txvwn" Mar 20 11:43:03 crc kubenswrapper[4695]: I0320 11:43:03.481343 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-txvwn" Mar 20 11:43:03 crc kubenswrapper[4695]: I0320 11:43:03.535994 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-txvwn" Mar 20 11:43:04 crc kubenswrapper[4695]: I0320 
11:43:04.236813 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-txvwn" Mar 20 11:43:04 crc kubenswrapper[4695]: I0320 11:43:04.297545 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-txvwn"] Mar 20 11:43:06 crc kubenswrapper[4695]: I0320 11:43:06.193489 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-txvwn" podUID="ff1b4a44-3245-417e-b97d-04983202ba5a" containerName="registry-server" containerID="cri-o://78622887b373e1e4be56f45ea7b187bf5439c090a11a3865836036fc48aab478" gracePeriod=2 Mar 20 11:43:06 crc kubenswrapper[4695]: I0320 11:43:06.661012 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-txvwn" Mar 20 11:43:06 crc kubenswrapper[4695]: I0320 11:43:06.737240 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff1b4a44-3245-417e-b97d-04983202ba5a-utilities\") pod \"ff1b4a44-3245-417e-b97d-04983202ba5a\" (UID: \"ff1b4a44-3245-417e-b97d-04983202ba5a\") " Mar 20 11:43:06 crc kubenswrapper[4695]: I0320 11:43:06.737363 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctlz7\" (UniqueName: \"kubernetes.io/projected/ff1b4a44-3245-417e-b97d-04983202ba5a-kube-api-access-ctlz7\") pod \"ff1b4a44-3245-417e-b97d-04983202ba5a\" (UID: \"ff1b4a44-3245-417e-b97d-04983202ba5a\") " Mar 20 11:43:06 crc kubenswrapper[4695]: I0320 11:43:06.737390 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff1b4a44-3245-417e-b97d-04983202ba5a-catalog-content\") pod \"ff1b4a44-3245-417e-b97d-04983202ba5a\" (UID: \"ff1b4a44-3245-417e-b97d-04983202ba5a\") " Mar 20 11:43:06 crc kubenswrapper[4695]: 
I0320 11:43:06.738680 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff1b4a44-3245-417e-b97d-04983202ba5a-utilities" (OuterVolumeSpecName: "utilities") pod "ff1b4a44-3245-417e-b97d-04983202ba5a" (UID: "ff1b4a44-3245-417e-b97d-04983202ba5a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:43:06 crc kubenswrapper[4695]: I0320 11:43:06.745853 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff1b4a44-3245-417e-b97d-04983202ba5a-kube-api-access-ctlz7" (OuterVolumeSpecName: "kube-api-access-ctlz7") pod "ff1b4a44-3245-417e-b97d-04983202ba5a" (UID: "ff1b4a44-3245-417e-b97d-04983202ba5a"). InnerVolumeSpecName "kube-api-access-ctlz7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:43:06 crc kubenswrapper[4695]: I0320 11:43:06.839840 4695 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff1b4a44-3245-417e-b97d-04983202ba5a-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:43:06 crc kubenswrapper[4695]: I0320 11:43:06.839902 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctlz7\" (UniqueName: \"kubernetes.io/projected/ff1b4a44-3245-417e-b97d-04983202ba5a-kube-api-access-ctlz7\") on node \"crc\" DevicePath \"\"" Mar 20 11:43:07 crc kubenswrapper[4695]: I0320 11:43:07.202437 4695 generic.go:334] "Generic (PLEG): container finished" podID="ff1b4a44-3245-417e-b97d-04983202ba5a" containerID="78622887b373e1e4be56f45ea7b187bf5439c090a11a3865836036fc48aab478" exitCode=0 Mar 20 11:43:07 crc kubenswrapper[4695]: I0320 11:43:07.202587 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-txvwn" Mar 20 11:43:07 crc kubenswrapper[4695]: I0320 11:43:07.202592 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txvwn" event={"ID":"ff1b4a44-3245-417e-b97d-04983202ba5a","Type":"ContainerDied","Data":"78622887b373e1e4be56f45ea7b187bf5439c090a11a3865836036fc48aab478"} Mar 20 11:43:07 crc kubenswrapper[4695]: I0320 11:43:07.202999 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-txvwn" event={"ID":"ff1b4a44-3245-417e-b97d-04983202ba5a","Type":"ContainerDied","Data":"8f8e5972074ab08241e37e927b58d8cbcb4c6763b624d3fd16880e236d8114f1"} Mar 20 11:43:07 crc kubenswrapper[4695]: I0320 11:43:07.203025 4695 scope.go:117] "RemoveContainer" containerID="78622887b373e1e4be56f45ea7b187bf5439c090a11a3865836036fc48aab478" Mar 20 11:43:07 crc kubenswrapper[4695]: I0320 11:43:07.223893 4695 scope.go:117] "RemoveContainer" containerID="3650588d8b82e9a4a8f511698f94f1dc5f2f0ddcc11f966a046e57b81f9afb38" Mar 20 11:43:07 crc kubenswrapper[4695]: I0320 11:43:07.241269 4695 scope.go:117] "RemoveContainer" containerID="fd7232c4680842f8753a807f29a9b557c92b77e54819732dc6ba914789299ffe" Mar 20 11:43:07 crc kubenswrapper[4695]: I0320 11:43:07.263445 4695 scope.go:117] "RemoveContainer" containerID="78622887b373e1e4be56f45ea7b187bf5439c090a11a3865836036fc48aab478" Mar 20 11:43:07 crc kubenswrapper[4695]: E0320 11:43:07.264241 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78622887b373e1e4be56f45ea7b187bf5439c090a11a3865836036fc48aab478\": container with ID starting with 78622887b373e1e4be56f45ea7b187bf5439c090a11a3865836036fc48aab478 not found: ID does not exist" containerID="78622887b373e1e4be56f45ea7b187bf5439c090a11a3865836036fc48aab478" Mar 20 11:43:07 crc kubenswrapper[4695]: I0320 11:43:07.264268 4695 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78622887b373e1e4be56f45ea7b187bf5439c090a11a3865836036fc48aab478"} err="failed to get container status \"78622887b373e1e4be56f45ea7b187bf5439c090a11a3865836036fc48aab478\": rpc error: code = NotFound desc = could not find container \"78622887b373e1e4be56f45ea7b187bf5439c090a11a3865836036fc48aab478\": container with ID starting with 78622887b373e1e4be56f45ea7b187bf5439c090a11a3865836036fc48aab478 not found: ID does not exist" Mar 20 11:43:07 crc kubenswrapper[4695]: I0320 11:43:07.264292 4695 scope.go:117] "RemoveContainer" containerID="3650588d8b82e9a4a8f511698f94f1dc5f2f0ddcc11f966a046e57b81f9afb38" Mar 20 11:43:07 crc kubenswrapper[4695]: E0320 11:43:07.264509 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3650588d8b82e9a4a8f511698f94f1dc5f2f0ddcc11f966a046e57b81f9afb38\": container with ID starting with 3650588d8b82e9a4a8f511698f94f1dc5f2f0ddcc11f966a046e57b81f9afb38 not found: ID does not exist" containerID="3650588d8b82e9a4a8f511698f94f1dc5f2f0ddcc11f966a046e57b81f9afb38" Mar 20 11:43:07 crc kubenswrapper[4695]: I0320 11:43:07.264559 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3650588d8b82e9a4a8f511698f94f1dc5f2f0ddcc11f966a046e57b81f9afb38"} err="failed to get container status \"3650588d8b82e9a4a8f511698f94f1dc5f2f0ddcc11f966a046e57b81f9afb38\": rpc error: code = NotFound desc = could not find container \"3650588d8b82e9a4a8f511698f94f1dc5f2f0ddcc11f966a046e57b81f9afb38\": container with ID starting with 3650588d8b82e9a4a8f511698f94f1dc5f2f0ddcc11f966a046e57b81f9afb38 not found: ID does not exist" Mar 20 11:43:07 crc kubenswrapper[4695]: I0320 11:43:07.264582 4695 scope.go:117] "RemoveContainer" containerID="fd7232c4680842f8753a807f29a9b557c92b77e54819732dc6ba914789299ffe" Mar 20 11:43:07 crc kubenswrapper[4695]: E0320 11:43:07.264866 4695 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd7232c4680842f8753a807f29a9b557c92b77e54819732dc6ba914789299ffe\": container with ID starting with fd7232c4680842f8753a807f29a9b557c92b77e54819732dc6ba914789299ffe not found: ID does not exist" containerID="fd7232c4680842f8753a807f29a9b557c92b77e54819732dc6ba914789299ffe" Mar 20 11:43:07 crc kubenswrapper[4695]: I0320 11:43:07.264936 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd7232c4680842f8753a807f29a9b557c92b77e54819732dc6ba914789299ffe"} err="failed to get container status \"fd7232c4680842f8753a807f29a9b557c92b77e54819732dc6ba914789299ffe\": rpc error: code = NotFound desc = could not find container \"fd7232c4680842f8753a807f29a9b557c92b77e54819732dc6ba914789299ffe\": container with ID starting with fd7232c4680842f8753a807f29a9b557c92b77e54819732dc6ba914789299ffe not found: ID does not exist" Mar 20 11:43:07 crc kubenswrapper[4695]: I0320 11:43:07.274977 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff1b4a44-3245-417e-b97d-04983202ba5a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff1b4a44-3245-417e-b97d-04983202ba5a" (UID: "ff1b4a44-3245-417e-b97d-04983202ba5a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:43:07 crc kubenswrapper[4695]: I0320 11:43:07.348299 4695 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff1b4a44-3245-417e-b97d-04983202ba5a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:43:07 crc kubenswrapper[4695]: I0320 11:43:07.535383 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-txvwn"] Mar 20 11:43:07 crc kubenswrapper[4695]: I0320 11:43:07.541393 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-txvwn"] Mar 20 11:43:07 crc kubenswrapper[4695]: E0320 11:43:07.594390 4695 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff1b4a44_3245_417e_b97d_04983202ba5a.slice\": RecentStats: unable to find data in memory cache]" Mar 20 11:43:08 crc kubenswrapper[4695]: I0320 11:43:08.431498 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:43:08 crc kubenswrapper[4695]: I0320 11:43:08.431589 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:43:08 crc kubenswrapper[4695]: I0320 11:43:08.904342 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff1b4a44-3245-417e-b97d-04983202ba5a" 
path="/var/lib/kubelet/pods/ff1b4a44-3245-417e-b97d-04983202ba5a/volumes" Mar 20 11:43:38 crc kubenswrapper[4695]: I0320 11:43:38.431307 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:43:38 crc kubenswrapper[4695]: I0320 11:43:38.432042 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:43:38 crc kubenswrapper[4695]: I0320 11:43:38.432089 4695 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" Mar 20 11:43:38 crc kubenswrapper[4695]: I0320 11:43:38.432741 4695 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683"} pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:43:38 crc kubenswrapper[4695]: I0320 11:43:38.432792 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" containerID="cri-o://9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683" gracePeriod=600 Mar 20 11:43:38 crc kubenswrapper[4695]: E0320 11:43:38.553532 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:43:39 crc kubenswrapper[4695]: I0320 11:43:39.523491 4695 generic.go:334] "Generic (PLEG): container finished" podID="7859c924-84d7-4855-901e-c77a02c56e3a" containerID="9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683" exitCode=0 Mar 20 11:43:39 crc kubenswrapper[4695]: I0320 11:43:39.523550 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" event={"ID":"7859c924-84d7-4855-901e-c77a02c56e3a","Type":"ContainerDied","Data":"9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683"} Mar 20 11:43:39 crc kubenswrapper[4695]: I0320 11:43:39.523601 4695 scope.go:117] "RemoveContainer" containerID="75ca987e90ee0fcab0d5950d62b41475299b0a6565381ed22317c122106f8d27" Mar 20 11:43:39 crc kubenswrapper[4695]: I0320 11:43:39.524359 4695 scope.go:117] "RemoveContainer" containerID="9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683" Mar 20 11:43:39 crc kubenswrapper[4695]: E0320 11:43:39.524852 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:43:52 crc kubenswrapper[4695]: I0320 11:43:52.894416 4695 scope.go:117] "RemoveContainer" containerID="9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683" Mar 20 11:43:52 crc 
kubenswrapper[4695]: E0320 11:43:52.895490 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:44:00 crc kubenswrapper[4695]: I0320 11:44:00.140686 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566784-nsc7l"] Mar 20 11:44:00 crc kubenswrapper[4695]: E0320 11:44:00.141725 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff1b4a44-3245-417e-b97d-04983202ba5a" containerName="extract-utilities" Mar 20 11:44:00 crc kubenswrapper[4695]: I0320 11:44:00.141745 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff1b4a44-3245-417e-b97d-04983202ba5a" containerName="extract-utilities" Mar 20 11:44:00 crc kubenswrapper[4695]: E0320 11:44:00.141764 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff1b4a44-3245-417e-b97d-04983202ba5a" containerName="registry-server" Mar 20 11:44:00 crc kubenswrapper[4695]: I0320 11:44:00.141770 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff1b4a44-3245-417e-b97d-04983202ba5a" containerName="registry-server" Mar 20 11:44:00 crc kubenswrapper[4695]: E0320 11:44:00.141783 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff1b4a44-3245-417e-b97d-04983202ba5a" containerName="extract-content" Mar 20 11:44:00 crc kubenswrapper[4695]: I0320 11:44:00.141791 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff1b4a44-3245-417e-b97d-04983202ba5a" containerName="extract-content" Mar 20 11:44:00 crc kubenswrapper[4695]: I0320 11:44:00.141984 4695 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ff1b4a44-3245-417e-b97d-04983202ba5a" containerName="registry-server" Mar 20 11:44:00 crc kubenswrapper[4695]: I0320 11:44:00.142606 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566784-nsc7l" Mar 20 11:44:00 crc kubenswrapper[4695]: I0320 11:44:00.144528 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:44:00 crc kubenswrapper[4695]: I0320 11:44:00.145793 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5kqds" Mar 20 11:44:00 crc kubenswrapper[4695]: I0320 11:44:00.146232 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:44:00 crc kubenswrapper[4695]: I0320 11:44:00.150596 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566784-nsc7l"] Mar 20 11:44:00 crc kubenswrapper[4695]: I0320 11:44:00.215958 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtcvj\" (UniqueName: \"kubernetes.io/projected/a8a0bbf6-76f8-425b-bd9a-d834207d3ac0-kube-api-access-wtcvj\") pod \"auto-csr-approver-29566784-nsc7l\" (UID: \"a8a0bbf6-76f8-425b-bd9a-d834207d3ac0\") " pod="openshift-infra/auto-csr-approver-29566784-nsc7l" Mar 20 11:44:00 crc kubenswrapper[4695]: I0320 11:44:00.318323 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtcvj\" (UniqueName: \"kubernetes.io/projected/a8a0bbf6-76f8-425b-bd9a-d834207d3ac0-kube-api-access-wtcvj\") pod \"auto-csr-approver-29566784-nsc7l\" (UID: \"a8a0bbf6-76f8-425b-bd9a-d834207d3ac0\") " pod="openshift-infra/auto-csr-approver-29566784-nsc7l" Mar 20 11:44:00 crc kubenswrapper[4695]: I0320 11:44:00.348370 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtcvj\" 
(UniqueName: \"kubernetes.io/projected/a8a0bbf6-76f8-425b-bd9a-d834207d3ac0-kube-api-access-wtcvj\") pod \"auto-csr-approver-29566784-nsc7l\" (UID: \"a8a0bbf6-76f8-425b-bd9a-d834207d3ac0\") " pod="openshift-infra/auto-csr-approver-29566784-nsc7l" Mar 20 11:44:00 crc kubenswrapper[4695]: I0320 11:44:00.463069 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566784-nsc7l" Mar 20 11:44:00 crc kubenswrapper[4695]: I0320 11:44:00.974386 4695 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:44:00 crc kubenswrapper[4695]: I0320 11:44:00.981149 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566784-nsc7l"] Mar 20 11:44:01 crc kubenswrapper[4695]: I0320 11:44:01.696418 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566784-nsc7l" event={"ID":"a8a0bbf6-76f8-425b-bd9a-d834207d3ac0","Type":"ContainerStarted","Data":"2616c1cabec4d947ad8e3ee387bd842dee051f9a674e3b7eab4ecc1c92ec2b5d"} Mar 20 11:44:02 crc kubenswrapper[4695]: I0320 11:44:02.706691 4695 generic.go:334] "Generic (PLEG): container finished" podID="a8a0bbf6-76f8-425b-bd9a-d834207d3ac0" containerID="e6bcfd88e3748f635146c21ab6125c935129598a21c6023646137cad9034dd07" exitCode=0 Mar 20 11:44:02 crc kubenswrapper[4695]: I0320 11:44:02.706801 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566784-nsc7l" event={"ID":"a8a0bbf6-76f8-425b-bd9a-d834207d3ac0","Type":"ContainerDied","Data":"e6bcfd88e3748f635146c21ab6125c935129598a21c6023646137cad9034dd07"} Mar 20 11:44:03 crc kubenswrapper[4695]: I0320 11:44:03.994706 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566784-nsc7l" Mar 20 11:44:04 crc kubenswrapper[4695]: I0320 11:44:04.094788 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtcvj\" (UniqueName: \"kubernetes.io/projected/a8a0bbf6-76f8-425b-bd9a-d834207d3ac0-kube-api-access-wtcvj\") pod \"a8a0bbf6-76f8-425b-bd9a-d834207d3ac0\" (UID: \"a8a0bbf6-76f8-425b-bd9a-d834207d3ac0\") " Mar 20 11:44:04 crc kubenswrapper[4695]: I0320 11:44:04.102679 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8a0bbf6-76f8-425b-bd9a-d834207d3ac0-kube-api-access-wtcvj" (OuterVolumeSpecName: "kube-api-access-wtcvj") pod "a8a0bbf6-76f8-425b-bd9a-d834207d3ac0" (UID: "a8a0bbf6-76f8-425b-bd9a-d834207d3ac0"). InnerVolumeSpecName "kube-api-access-wtcvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:44:04 crc kubenswrapper[4695]: I0320 11:44:04.197705 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtcvj\" (UniqueName: \"kubernetes.io/projected/a8a0bbf6-76f8-425b-bd9a-d834207d3ac0-kube-api-access-wtcvj\") on node \"crc\" DevicePath \"\"" Mar 20 11:44:04 crc kubenswrapper[4695]: I0320 11:44:04.725877 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566784-nsc7l" event={"ID":"a8a0bbf6-76f8-425b-bd9a-d834207d3ac0","Type":"ContainerDied","Data":"2616c1cabec4d947ad8e3ee387bd842dee051f9a674e3b7eab4ecc1c92ec2b5d"} Mar 20 11:44:04 crc kubenswrapper[4695]: I0320 11:44:04.725951 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2616c1cabec4d947ad8e3ee387bd842dee051f9a674e3b7eab4ecc1c92ec2b5d" Mar 20 11:44:04 crc kubenswrapper[4695]: I0320 11:44:04.726396 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566784-nsc7l" Mar 20 11:44:05 crc kubenswrapper[4695]: I0320 11:44:05.072514 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566778-xn2gw"] Mar 20 11:44:05 crc kubenswrapper[4695]: I0320 11:44:05.077538 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566778-xn2gw"] Mar 20 11:44:06 crc kubenswrapper[4695]: I0320 11:44:06.900347 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="715c3ab3-d767-4433-aac3-fbb7a527f63a" path="/var/lib/kubelet/pods/715c3ab3-d767-4433-aac3-fbb7a527f63a/volumes" Mar 20 11:44:07 crc kubenswrapper[4695]: I0320 11:44:07.306836 4695 scope.go:117] "RemoveContainer" containerID="1950b28e687eeb720c55fa7ed6ec631462f740c9144ccf4b0a43afef767560dd" Mar 20 11:44:07 crc kubenswrapper[4695]: I0320 11:44:07.887683 4695 scope.go:117] "RemoveContainer" containerID="9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683" Mar 20 11:44:07 crc kubenswrapper[4695]: E0320 11:44:07.887995 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:44:19 crc kubenswrapper[4695]: I0320 11:44:19.887889 4695 scope.go:117] "RemoveContainer" containerID="9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683" Mar 20 11:44:19 crc kubenswrapper[4695]: E0320 11:44:19.888971 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:44:33 crc kubenswrapper[4695]: I0320 11:44:33.887024 4695 scope.go:117] "RemoveContainer" containerID="9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683" Mar 20 11:44:33 crc kubenswrapper[4695]: E0320 11:44:33.888062 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:44:48 crc kubenswrapper[4695]: I0320 11:44:48.688361 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fg4nh/must-gather-vlbwx"] Mar 20 11:44:48 crc kubenswrapper[4695]: E0320 11:44:48.689466 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8a0bbf6-76f8-425b-bd9a-d834207d3ac0" containerName="oc" Mar 20 11:44:48 crc kubenswrapper[4695]: I0320 11:44:48.689482 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8a0bbf6-76f8-425b-bd9a-d834207d3ac0" containerName="oc" Mar 20 11:44:48 crc kubenswrapper[4695]: I0320 11:44:48.689661 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8a0bbf6-76f8-425b-bd9a-d834207d3ac0" containerName="oc" Mar 20 11:44:48 crc kubenswrapper[4695]: I0320 11:44:48.690616 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fg4nh/must-gather-vlbwx" Mar 20 11:44:48 crc kubenswrapper[4695]: I0320 11:44:48.693424 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fg4nh"/"openshift-service-ca.crt" Mar 20 11:44:48 crc kubenswrapper[4695]: I0320 11:44:48.693686 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fg4nh"/"kube-root-ca.crt" Mar 20 11:44:48 crc kubenswrapper[4695]: I0320 11:44:48.719871 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fg4nh/must-gather-vlbwx"] Mar 20 11:44:48 crc kubenswrapper[4695]: I0320 11:44:48.785367 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbt2h\" (UniqueName: \"kubernetes.io/projected/e768c601-230c-4ce1-b2a0-d65f6de22091-kube-api-access-mbt2h\") pod \"must-gather-vlbwx\" (UID: \"e768c601-230c-4ce1-b2a0-d65f6de22091\") " pod="openshift-must-gather-fg4nh/must-gather-vlbwx" Mar 20 11:44:48 crc kubenswrapper[4695]: I0320 11:44:48.785416 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e768c601-230c-4ce1-b2a0-d65f6de22091-must-gather-output\") pod \"must-gather-vlbwx\" (UID: \"e768c601-230c-4ce1-b2a0-d65f6de22091\") " pod="openshift-must-gather-fg4nh/must-gather-vlbwx" Mar 20 11:44:48 crc kubenswrapper[4695]: I0320 11:44:48.886666 4695 scope.go:117] "RemoveContainer" containerID="9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683" Mar 20 11:44:48 crc kubenswrapper[4695]: E0320 11:44:48.887052 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:44:48 crc kubenswrapper[4695]: I0320 11:44:48.887203 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbt2h\" (UniqueName: \"kubernetes.io/projected/e768c601-230c-4ce1-b2a0-d65f6de22091-kube-api-access-mbt2h\") pod \"must-gather-vlbwx\" (UID: \"e768c601-230c-4ce1-b2a0-d65f6de22091\") " pod="openshift-must-gather-fg4nh/must-gather-vlbwx" Mar 20 11:44:48 crc kubenswrapper[4695]: I0320 11:44:48.887242 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e768c601-230c-4ce1-b2a0-d65f6de22091-must-gather-output\") pod \"must-gather-vlbwx\" (UID: \"e768c601-230c-4ce1-b2a0-d65f6de22091\") " pod="openshift-must-gather-fg4nh/must-gather-vlbwx" Mar 20 11:44:48 crc kubenswrapper[4695]: I0320 11:44:48.888094 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e768c601-230c-4ce1-b2a0-d65f6de22091-must-gather-output\") pod \"must-gather-vlbwx\" (UID: \"e768c601-230c-4ce1-b2a0-d65f6de22091\") " pod="openshift-must-gather-fg4nh/must-gather-vlbwx" Mar 20 11:44:48 crc kubenswrapper[4695]: I0320 11:44:48.910023 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbt2h\" (UniqueName: \"kubernetes.io/projected/e768c601-230c-4ce1-b2a0-d65f6de22091-kube-api-access-mbt2h\") pod \"must-gather-vlbwx\" (UID: \"e768c601-230c-4ce1-b2a0-d65f6de22091\") " pod="openshift-must-gather-fg4nh/must-gather-vlbwx" Mar 20 11:44:49 crc kubenswrapper[4695]: I0320 11:44:49.008670 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fg4nh/must-gather-vlbwx" Mar 20 11:44:49 crc kubenswrapper[4695]: I0320 11:44:49.507616 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fg4nh/must-gather-vlbwx"] Mar 20 11:44:50 crc kubenswrapper[4695]: I0320 11:44:50.052946 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fg4nh/must-gather-vlbwx" event={"ID":"e768c601-230c-4ce1-b2a0-d65f6de22091","Type":"ContainerStarted","Data":"bfe968fc1531b3487c8c5fc9ccaed06520cda300f2c932e94a09cd212d5b1f35"} Mar 20 11:44:57 crc kubenswrapper[4695]: I0320 11:44:57.108468 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fg4nh/must-gather-vlbwx" event={"ID":"e768c601-230c-4ce1-b2a0-d65f6de22091","Type":"ContainerStarted","Data":"297e81a06c4a6de58ae0902e4ee54b7011de9bcaff688007c6ddf9930c7e7390"} Mar 20 11:44:57 crc kubenswrapper[4695]: I0320 11:44:57.109162 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fg4nh/must-gather-vlbwx" event={"ID":"e768c601-230c-4ce1-b2a0-d65f6de22091","Type":"ContainerStarted","Data":"2921c4a64e414d4fb0a1c8d35fee1340f32ebba7a6e9cea18cac19c62979e7d7"} Mar 20 11:44:57 crc kubenswrapper[4695]: I0320 11:44:57.129700 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fg4nh/must-gather-vlbwx" podStartSLOduration=2.517124434 podStartE2EDuration="9.129677813s" podCreationTimestamp="2026-03-20 11:44:48 +0000 UTC" firstStartedPulling="2026-03-20 11:44:49.514817352 +0000 UTC m=+3067.295422915" lastFinishedPulling="2026-03-20 11:44:56.127370731 +0000 UTC m=+3073.907976294" observedRunningTime="2026-03-20 11:44:57.128802331 +0000 UTC m=+3074.909407894" watchObservedRunningTime="2026-03-20 11:44:57.129677813 +0000 UTC m=+3074.910283376" Mar 20 11:45:00 crc kubenswrapper[4695]: I0320 11:45:00.149137 4695 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29566785-9fnkf"] Mar 20 11:45:00 crc kubenswrapper[4695]: I0320 11:45:00.151500 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9fnkf" Mar 20 11:45:00 crc kubenswrapper[4695]: I0320 11:45:00.156881 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 11:45:00 crc kubenswrapper[4695]: I0320 11:45:00.164811 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 11:45:00 crc kubenswrapper[4695]: I0320 11:45:00.190431 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566785-9fnkf"] Mar 20 11:45:00 crc kubenswrapper[4695]: I0320 11:45:00.279050 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d34a2236-7162-4eca-8292-e07159776e80-config-volume\") pod \"collect-profiles-29566785-9fnkf\" (UID: \"d34a2236-7162-4eca-8292-e07159776e80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9fnkf" Mar 20 11:45:00 crc kubenswrapper[4695]: I0320 11:45:00.279522 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d34a2236-7162-4eca-8292-e07159776e80-secret-volume\") pod \"collect-profiles-29566785-9fnkf\" (UID: \"d34a2236-7162-4eca-8292-e07159776e80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9fnkf" Mar 20 11:45:00 crc kubenswrapper[4695]: I0320 11:45:00.279559 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjkg7\" (UniqueName: 
\"kubernetes.io/projected/d34a2236-7162-4eca-8292-e07159776e80-kube-api-access-gjkg7\") pod \"collect-profiles-29566785-9fnkf\" (UID: \"d34a2236-7162-4eca-8292-e07159776e80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9fnkf" Mar 20 11:45:00 crc kubenswrapper[4695]: I0320 11:45:00.381469 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d34a2236-7162-4eca-8292-e07159776e80-secret-volume\") pod \"collect-profiles-29566785-9fnkf\" (UID: \"d34a2236-7162-4eca-8292-e07159776e80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9fnkf" Mar 20 11:45:00 crc kubenswrapper[4695]: I0320 11:45:00.381793 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjkg7\" (UniqueName: \"kubernetes.io/projected/d34a2236-7162-4eca-8292-e07159776e80-kube-api-access-gjkg7\") pod \"collect-profiles-29566785-9fnkf\" (UID: \"d34a2236-7162-4eca-8292-e07159776e80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9fnkf" Mar 20 11:45:00 crc kubenswrapper[4695]: I0320 11:45:00.381904 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d34a2236-7162-4eca-8292-e07159776e80-config-volume\") pod \"collect-profiles-29566785-9fnkf\" (UID: \"d34a2236-7162-4eca-8292-e07159776e80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9fnkf" Mar 20 11:45:00 crc kubenswrapper[4695]: I0320 11:45:00.383221 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d34a2236-7162-4eca-8292-e07159776e80-config-volume\") pod \"collect-profiles-29566785-9fnkf\" (UID: \"d34a2236-7162-4eca-8292-e07159776e80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9fnkf" Mar 20 11:45:00 crc kubenswrapper[4695]: I0320 
11:45:00.403181 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d34a2236-7162-4eca-8292-e07159776e80-secret-volume\") pod \"collect-profiles-29566785-9fnkf\" (UID: \"d34a2236-7162-4eca-8292-e07159776e80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9fnkf" Mar 20 11:45:00 crc kubenswrapper[4695]: I0320 11:45:00.406222 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjkg7\" (UniqueName: \"kubernetes.io/projected/d34a2236-7162-4eca-8292-e07159776e80-kube-api-access-gjkg7\") pod \"collect-profiles-29566785-9fnkf\" (UID: \"d34a2236-7162-4eca-8292-e07159776e80\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9fnkf" Mar 20 11:45:00 crc kubenswrapper[4695]: I0320 11:45:00.482117 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9fnkf" Mar 20 11:45:00 crc kubenswrapper[4695]: I0320 11:45:00.957668 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566785-9fnkf"] Mar 20 11:45:00 crc kubenswrapper[4695]: W0320 11:45:00.965452 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd34a2236_7162_4eca_8292_e07159776e80.slice/crio-e55a7bb5818cc204b02d78c9affffe63226c4898ac126e6dddd6fbf767ab3d5e WatchSource:0}: Error finding container e55a7bb5818cc204b02d78c9affffe63226c4898ac126e6dddd6fbf767ab3d5e: Status 404 returned error can't find the container with id e55a7bb5818cc204b02d78c9affffe63226c4898ac126e6dddd6fbf767ab3d5e Mar 20 11:45:01 crc kubenswrapper[4695]: I0320 11:45:01.136335 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9fnkf" 
event={"ID":"d34a2236-7162-4eca-8292-e07159776e80","Type":"ContainerStarted","Data":"e55a7bb5818cc204b02d78c9affffe63226c4898ac126e6dddd6fbf767ab3d5e"} Mar 20 11:45:01 crc kubenswrapper[4695]: I0320 11:45:01.887175 4695 scope.go:117] "RemoveContainer" containerID="9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683" Mar 20 11:45:01 crc kubenswrapper[4695]: E0320 11:45:01.887830 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:45:02 crc kubenswrapper[4695]: I0320 11:45:02.144064 4695 generic.go:334] "Generic (PLEG): container finished" podID="d34a2236-7162-4eca-8292-e07159776e80" containerID="b246efa5758300b68d7eececcc921c9e386c182b3c6572bc07759980a1ecbc17" exitCode=0 Mar 20 11:45:02 crc kubenswrapper[4695]: I0320 11:45:02.144112 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9fnkf" event={"ID":"d34a2236-7162-4eca-8292-e07159776e80","Type":"ContainerDied","Data":"b246efa5758300b68d7eececcc921c9e386c182b3c6572bc07759980a1ecbc17"} Mar 20 11:45:03 crc kubenswrapper[4695]: I0320 11:45:03.598360 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9fnkf" Mar 20 11:45:03 crc kubenswrapper[4695]: I0320 11:45:03.739350 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d34a2236-7162-4eca-8292-e07159776e80-secret-volume\") pod \"d34a2236-7162-4eca-8292-e07159776e80\" (UID: \"d34a2236-7162-4eca-8292-e07159776e80\") " Mar 20 11:45:03 crc kubenswrapper[4695]: I0320 11:45:03.739443 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjkg7\" (UniqueName: \"kubernetes.io/projected/d34a2236-7162-4eca-8292-e07159776e80-kube-api-access-gjkg7\") pod \"d34a2236-7162-4eca-8292-e07159776e80\" (UID: \"d34a2236-7162-4eca-8292-e07159776e80\") " Mar 20 11:45:03 crc kubenswrapper[4695]: I0320 11:45:03.739527 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d34a2236-7162-4eca-8292-e07159776e80-config-volume\") pod \"d34a2236-7162-4eca-8292-e07159776e80\" (UID: \"d34a2236-7162-4eca-8292-e07159776e80\") " Mar 20 11:45:03 crc kubenswrapper[4695]: I0320 11:45:03.740766 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d34a2236-7162-4eca-8292-e07159776e80-config-volume" (OuterVolumeSpecName: "config-volume") pod "d34a2236-7162-4eca-8292-e07159776e80" (UID: "d34a2236-7162-4eca-8292-e07159776e80"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:45:03 crc kubenswrapper[4695]: I0320 11:45:03.746315 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34a2236-7162-4eca-8292-e07159776e80-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d34a2236-7162-4eca-8292-e07159776e80" (UID: "d34a2236-7162-4eca-8292-e07159776e80"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:45:03 crc kubenswrapper[4695]: I0320 11:45:03.753287 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d34a2236-7162-4eca-8292-e07159776e80-kube-api-access-gjkg7" (OuterVolumeSpecName: "kube-api-access-gjkg7") pod "d34a2236-7162-4eca-8292-e07159776e80" (UID: "d34a2236-7162-4eca-8292-e07159776e80"). InnerVolumeSpecName "kube-api-access-gjkg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:45:03 crc kubenswrapper[4695]: I0320 11:45:03.840834 4695 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d34a2236-7162-4eca-8292-e07159776e80-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:45:03 crc kubenswrapper[4695]: I0320 11:45:03.840879 4695 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d34a2236-7162-4eca-8292-e07159776e80-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:45:03 crc kubenswrapper[4695]: I0320 11:45:03.840891 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gjkg7\" (UniqueName: \"kubernetes.io/projected/d34a2236-7162-4eca-8292-e07159776e80-kube-api-access-gjkg7\") on node \"crc\" DevicePath \"\"" Mar 20 11:45:04 crc kubenswrapper[4695]: I0320 11:45:04.160567 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9fnkf" event={"ID":"d34a2236-7162-4eca-8292-e07159776e80","Type":"ContainerDied","Data":"e55a7bb5818cc204b02d78c9affffe63226c4898ac126e6dddd6fbf767ab3d5e"} Mar 20 11:45:04 crc kubenswrapper[4695]: I0320 11:45:04.160620 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e55a7bb5818cc204b02d78c9affffe63226c4898ac126e6dddd6fbf767ab3d5e" Mar 20 11:45:04 crc kubenswrapper[4695]: I0320 11:45:04.160997 4695 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9fnkf" Mar 20 11:45:04 crc kubenswrapper[4695]: I0320 11:45:04.680672 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566740-82jvj"] Mar 20 11:45:04 crc kubenswrapper[4695]: I0320 11:45:04.687299 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566740-82jvj"] Mar 20 11:45:04 crc kubenswrapper[4695]: I0320 11:45:04.906512 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="940bd5a7-5746-4008-9f5a-84a7c0dc4c15" path="/var/lib/kubelet/pods/940bd5a7-5746-4008-9f5a-84a7c0dc4c15/volumes" Mar 20 11:45:07 crc kubenswrapper[4695]: I0320 11:45:07.384471 4695 scope.go:117] "RemoveContainer" containerID="d4befc3dbecc6e35a03b7414591ca6efd36fab355e376ecdfd0caaaba1353d85" Mar 20 11:45:13 crc kubenswrapper[4695]: I0320 11:45:13.887016 4695 scope.go:117] "RemoveContainer" containerID="9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683" Mar 20 11:45:13 crc kubenswrapper[4695]: E0320 11:45:13.887866 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:45:26 crc kubenswrapper[4695]: I0320 11:45:26.887343 4695 scope.go:117] "RemoveContainer" containerID="9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683" Mar 20 11:45:26 crc kubenswrapper[4695]: E0320 11:45:26.888632 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:45:38 crc kubenswrapper[4695]: I0320 11:45:38.887242 4695 scope.go:117] "RemoveContainer" containerID="9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683" Mar 20 11:45:38 crc kubenswrapper[4695]: E0320 11:45:38.890034 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:45:41 crc kubenswrapper[4695]: I0320 11:45:41.997447 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78dd6ddcc-nvpmk_c0907202-d2f3-4249-805b-1fcb750f56af/init/0.log" Mar 20 11:45:42 crc kubenswrapper[4695]: I0320 11:45:42.249868 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78dd6ddcc-nvpmk_c0907202-d2f3-4249-805b-1fcb750f56af/init/0.log" Mar 20 11:45:42 crc kubenswrapper[4695]: I0320 11:45:42.255685 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78dd6ddcc-nvpmk_c0907202-d2f3-4249-805b-1fcb750f56af/dnsmasq-dns/0.log" Mar 20 11:45:53 crc kubenswrapper[4695]: I0320 11:45:53.887447 4695 scope.go:117] "RemoveContainer" containerID="9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683" Mar 20 11:45:53 crc kubenswrapper[4695]: E0320 11:45:53.888273 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:45:56 crc kubenswrapper[4695]: I0320 11:45:56.390721 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn_343c2a1f-026f-4fd8-811f-790a957c3c82/util/0.log" Mar 20 11:45:56 crc kubenswrapper[4695]: I0320 11:45:56.699073 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn_343c2a1f-026f-4fd8-811f-790a957c3c82/util/0.log" Mar 20 11:45:56 crc kubenswrapper[4695]: I0320 11:45:56.753942 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn_343c2a1f-026f-4fd8-811f-790a957c3c82/pull/0.log" Mar 20 11:45:56 crc kubenswrapper[4695]: I0320 11:45:56.801836 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn_343c2a1f-026f-4fd8-811f-790a957c3c82/pull/0.log" Mar 20 11:45:56 crc kubenswrapper[4695]: I0320 11:45:56.949690 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn_343c2a1f-026f-4fd8-811f-790a957c3c82/extract/0.log" Mar 20 11:45:56 crc kubenswrapper[4695]: I0320 11:45:56.955645 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn_343c2a1f-026f-4fd8-811f-790a957c3c82/pull/0.log" Mar 20 11:45:56 crc kubenswrapper[4695]: I0320 11:45:56.971088 4695 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b0858rwtbn_343c2a1f-026f-4fd8-811f-790a957c3c82/util/0.log" Mar 20 11:45:57 crc kubenswrapper[4695]: I0320 11:45:57.150962 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-l29bw_c4f4cf94-cced-45aa-9d30-2a60e6a9e291/manager/0.log" Mar 20 11:45:57 crc kubenswrapper[4695]: I0320 11:45:57.412767 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-5t427_58250fd6-7e5e-429d-907a-c0f2725f029f/manager/0.log" Mar 20 11:45:57 crc kubenswrapper[4695]: I0320 11:45:57.466834 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-cg8tn_d46fd923-64a9-48cf-b3ea-05d6a676d7e1/manager/0.log" Mar 20 11:45:57 crc kubenswrapper[4695]: I0320 11:45:57.767970 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-sjz6g_99970094-eb53-4489-ba1d-1f650470c848/manager/0.log" Mar 20 11:45:57 crc kubenswrapper[4695]: I0320 11:45:57.781070 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-g8s68_a6a059b0-61e0-4592-8661-e480f9573c66/manager/0.log" Mar 20 11:45:57 crc kubenswrapper[4695]: I0320 11:45:57.977837 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-669fff9c7c-54mmr_fd90b802-5cbc-4d48-a76a-2903fab33ef0/manager/0.log" Mar 20 11:45:58 crc kubenswrapper[4695]: I0320 11:45:58.009546 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-2l9vk_bef240d4-6041-44e9-8228-f707a5f2f8eb/manager/0.log" Mar 20 11:45:58 crc kubenswrapper[4695]: I0320 11:45:58.205961 4695 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-ws2ch_9a8f730f-c9f3-4467-8b90-cfddd028ee71/manager/0.log" Mar 20 11:45:58 crc kubenswrapper[4695]: I0320 11:45:58.213662 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-tcghf_39cbf988-66c1-4ac9-9595-3cf263cde0aa/manager/0.log" Mar 20 11:45:58 crc kubenswrapper[4695]: I0320 11:45:58.378057 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-w62g8_5ce3aca1-15ad-43a5-be8f-b7c5580fcb59/manager/0.log" Mar 20 11:45:58 crc kubenswrapper[4695]: I0320 11:45:58.414368 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-m8kxp_7e6a711c-3208-459c-9f80-f29d5bbd0177/manager/0.log" Mar 20 11:45:58 crc kubenswrapper[4695]: I0320 11:45:58.555405 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-z62m6_d2a6f843-5ef0-48d2-9582-4c56551531a9/manager/0.log" Mar 20 11:45:58 crc kubenswrapper[4695]: I0320 11:45:58.666611 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-zcgwb_9ab1dbad-3ea8-4ed7-9284-d27f1516c26c/manager/0.log" Mar 20 11:45:58 crc kubenswrapper[4695]: I0320 11:45:58.761116 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-4q99q_26fc4733-51dc-4a8d-ba8e-03bd966cac17/manager/0.log" Mar 20 11:45:58 crc kubenswrapper[4695]: I0320 11:45:58.891921 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-m4c8w_16ad72ba-9b7f-47fc-8216-147e439de734/manager/0.log" Mar 20 11:45:59 crc kubenswrapper[4695]: I0320 
11:45:59.165343 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6697dffbc-5w2r2_53c876bd-49e1-4e7c-9673-91ebcd6b19a0/manager/0.log" Mar 20 11:45:59 crc kubenswrapper[4695]: I0320 11:45:59.166995 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-846ffbb776-b2zpq_b8023be3-6ce2-4167-bda0-378d061db8ac/operator/0.log" Mar 20 11:45:59 crc kubenswrapper[4695]: I0320 11:45:59.318123 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-vqtzk_c83f3723-7376-428e-b64a-3c081f0dab01/registry-server/0.log" Mar 20 11:45:59 crc kubenswrapper[4695]: I0320 11:45:59.370896 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-lnj2n_4a3d1db2-ad95-4c29-9d6f-6d6d5dfb1a1a/manager/0.log" Mar 20 11:45:59 crc kubenswrapper[4695]: I0320 11:45:59.527615 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-hxjpw_0b97d7e0-8c0d-425f-927c-cdf926f3b9fb/manager/0.log" Mar 20 11:45:59 crc kubenswrapper[4695]: I0320 11:45:59.619959 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-zs6kl_a7f91c18-4219-4040-b474-4d38d377071a/manager/0.log" Mar 20 11:45:59 crc kubenswrapper[4695]: I0320 11:45:59.702308 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-qwltk_23e52d31-9d42-439d-95a1-1761dee30f57/manager/0.log" Mar 20 11:45:59 crc kubenswrapper[4695]: I0320 11:45:59.828688 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-njjcr_498749d1-4031-4083-bb7b-e2640519795e/manager/0.log" Mar 20 11:45:59 crc kubenswrapper[4695]: I0320 
11:45:59.922162 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-hvnnc_f4807ed5-3bee-42a3-a23b-e7473fc1b833/manager/0.log" Mar 20 11:46:00 crc kubenswrapper[4695]: I0320 11:46:00.144148 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566786-2qpvb"] Mar 20 11:46:00 crc kubenswrapper[4695]: E0320 11:46:00.144580 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d34a2236-7162-4eca-8292-e07159776e80" containerName="collect-profiles" Mar 20 11:46:00 crc kubenswrapper[4695]: I0320 11:46:00.144597 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34a2236-7162-4eca-8292-e07159776e80" containerName="collect-profiles" Mar 20 11:46:00 crc kubenswrapper[4695]: I0320 11:46:00.144807 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="d34a2236-7162-4eca-8292-e07159776e80" containerName="collect-profiles" Mar 20 11:46:00 crc kubenswrapper[4695]: I0320 11:46:00.145416 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566786-2qpvb" Mar 20 11:46:00 crc kubenswrapper[4695]: I0320 11:46:00.149033 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:46:00 crc kubenswrapper[4695]: I0320 11:46:00.149206 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5kqds" Mar 20 11:46:00 crc kubenswrapper[4695]: I0320 11:46:00.149274 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:46:00 crc kubenswrapper[4695]: I0320 11:46:00.155755 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566786-2qpvb"] Mar 20 11:46:00 crc kubenswrapper[4695]: I0320 11:46:00.326400 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8g4c2\" (UniqueName: \"kubernetes.io/projected/5c827f5a-671e-4fbd-acac-bb6029906f10-kube-api-access-8g4c2\") pod \"auto-csr-approver-29566786-2qpvb\" (UID: \"5c827f5a-671e-4fbd-acac-bb6029906f10\") " pod="openshift-infra/auto-csr-approver-29566786-2qpvb" Mar 20 11:46:00 crc kubenswrapper[4695]: I0320 11:46:00.429542 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8g4c2\" (UniqueName: \"kubernetes.io/projected/5c827f5a-671e-4fbd-acac-bb6029906f10-kube-api-access-8g4c2\") pod \"auto-csr-approver-29566786-2qpvb\" (UID: \"5c827f5a-671e-4fbd-acac-bb6029906f10\") " pod="openshift-infra/auto-csr-approver-29566786-2qpvb" Mar 20 11:46:00 crc kubenswrapper[4695]: I0320 11:46:00.460723 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8g4c2\" (UniqueName: \"kubernetes.io/projected/5c827f5a-671e-4fbd-acac-bb6029906f10-kube-api-access-8g4c2\") pod \"auto-csr-approver-29566786-2qpvb\" (UID: \"5c827f5a-671e-4fbd-acac-bb6029906f10\") " 
pod="openshift-infra/auto-csr-approver-29566786-2qpvb" Mar 20 11:46:00 crc kubenswrapper[4695]: I0320 11:46:00.638773 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566786-2qpvb" Mar 20 11:46:01 crc kubenswrapper[4695]: I0320 11:46:01.315404 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566786-2qpvb"] Mar 20 11:46:01 crc kubenswrapper[4695]: I0320 11:46:01.599506 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566786-2qpvb" event={"ID":"5c827f5a-671e-4fbd-acac-bb6029906f10","Type":"ContainerStarted","Data":"e999b74b10dc41cb9a49e32a847620655a3179aba19d20f4dfec7c1b75df1ac2"} Mar 20 11:46:03 crc kubenswrapper[4695]: I0320 11:46:03.619613 4695 generic.go:334] "Generic (PLEG): container finished" podID="5c827f5a-671e-4fbd-acac-bb6029906f10" containerID="0697f9b13d8e10ac3e3ad68e61de3146506c09e3896dd149535a3744d87786d0" exitCode=0 Mar 20 11:46:03 crc kubenswrapper[4695]: I0320 11:46:03.619676 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566786-2qpvb" event={"ID":"5c827f5a-671e-4fbd-acac-bb6029906f10","Type":"ContainerDied","Data":"0697f9b13d8e10ac3e3ad68e61de3146506c09e3896dd149535a3744d87786d0"} Mar 20 11:46:04 crc kubenswrapper[4695]: I0320 11:46:04.952252 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566786-2qpvb" Mar 20 11:46:04 crc kubenswrapper[4695]: I0320 11:46:04.977811 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8g4c2\" (UniqueName: \"kubernetes.io/projected/5c827f5a-671e-4fbd-acac-bb6029906f10-kube-api-access-8g4c2\") pod \"5c827f5a-671e-4fbd-acac-bb6029906f10\" (UID: \"5c827f5a-671e-4fbd-acac-bb6029906f10\") " Mar 20 11:46:04 crc kubenswrapper[4695]: I0320 11:46:04.991152 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c827f5a-671e-4fbd-acac-bb6029906f10-kube-api-access-8g4c2" (OuterVolumeSpecName: "kube-api-access-8g4c2") pod "5c827f5a-671e-4fbd-acac-bb6029906f10" (UID: "5c827f5a-671e-4fbd-acac-bb6029906f10"). InnerVolumeSpecName "kube-api-access-8g4c2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:46:05 crc kubenswrapper[4695]: I0320 11:46:05.079559 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8g4c2\" (UniqueName: \"kubernetes.io/projected/5c827f5a-671e-4fbd-acac-bb6029906f10-kube-api-access-8g4c2\") on node \"crc\" DevicePath \"\"" Mar 20 11:46:05 crc kubenswrapper[4695]: I0320 11:46:05.750183 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566786-2qpvb" event={"ID":"5c827f5a-671e-4fbd-acac-bb6029906f10","Type":"ContainerDied","Data":"e999b74b10dc41cb9a49e32a847620655a3179aba19d20f4dfec7c1b75df1ac2"} Mar 20 11:46:05 crc kubenswrapper[4695]: I0320 11:46:05.750245 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e999b74b10dc41cb9a49e32a847620655a3179aba19d20f4dfec7c1b75df1ac2" Mar 20 11:46:05 crc kubenswrapper[4695]: I0320 11:46:05.750359 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566786-2qpvb" Mar 20 11:46:06 crc kubenswrapper[4695]: I0320 11:46:06.026448 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566780-p8xjb"] Mar 20 11:46:06 crc kubenswrapper[4695]: I0320 11:46:06.032967 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566780-p8xjb"] Mar 20 11:46:06 crc kubenswrapper[4695]: I0320 11:46:06.932058 4695 scope.go:117] "RemoveContainer" containerID="9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683" Mar 20 11:46:06 crc kubenswrapper[4695]: E0320 11:46:06.932310 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:46:06 crc kubenswrapper[4695]: I0320 11:46:06.944329 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c3f1296-6ed3-4ade-bfe0-6e9c880b9f1a" path="/var/lib/kubelet/pods/9c3f1296-6ed3-4ade-bfe0-6e9c880b9f1a/volumes" Mar 20 11:46:07 crc kubenswrapper[4695]: I0320 11:46:07.429523 4695 scope.go:117] "RemoveContainer" containerID="3331d7637395abfd19c3dbdfcadee04e194eaac23fbefc5ff9b5688c6575291b" Mar 20 11:46:19 crc kubenswrapper[4695]: I0320 11:46:19.886958 4695 scope.go:117] "RemoveContainer" containerID="9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683" Mar 20 11:46:19 crc kubenswrapper[4695]: E0320 11:46:19.889251 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:46:21 crc kubenswrapper[4695]: I0320 11:46:21.258118 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-lkn7q_95e09f08-a8b1-4873-80a7-cfeab3b3f3b6/control-plane-machine-set-operator/0.log" Mar 20 11:46:21 crc kubenswrapper[4695]: I0320 11:46:21.466854 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-92bhx_0cd0f715-9ef8-4e6c-8e56-8e17bd66d882/kube-rbac-proxy/0.log" Mar 20 11:46:21 crc kubenswrapper[4695]: I0320 11:46:21.485170 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-92bhx_0cd0f715-9ef8-4e6c-8e56-8e17bd66d882/machine-api-operator/0.log" Mar 20 11:46:33 crc kubenswrapper[4695]: I0320 11:46:33.887149 4695 scope.go:117] "RemoveContainer" containerID="9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683" Mar 20 11:46:33 crc kubenswrapper[4695]: E0320 11:46:33.887870 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:46:35 crc kubenswrapper[4695]: I0320 11:46:35.668305 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-5w452_bbe529d1-f905-4175-b7e7-44aa52a9cfcf/cert-manager-controller/0.log" Mar 20 11:46:35 crc kubenswrapper[4695]: I0320 11:46:35.833373 4695 log.go:25] "Finished 
parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-q6j7h_87d5459b-d306-4870-a3bd-d41bceeb6b50/cert-manager-cainjector/0.log" Mar 20 11:46:35 crc kubenswrapper[4695]: I0320 11:46:35.890791 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-tgcp4_0756f4f4-4178-4716-b64e-33302d72d6de/cert-manager-webhook/0.log" Mar 20 11:46:44 crc kubenswrapper[4695]: I0320 11:46:44.887418 4695 scope.go:117] "RemoveContainer" containerID="9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683" Mar 20 11:46:44 crc kubenswrapper[4695]: E0320 11:46:44.890382 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:46:49 crc kubenswrapper[4695]: I0320 11:46:49.650837 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-vs8wz_812dca14-f391-4297-831b-007e693807b2/nmstate-console-plugin/0.log" Mar 20 11:46:49 crc kubenswrapper[4695]: I0320 11:46:49.833623 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-bcl9w_136cb8fe-69bb-40ec-badd-ccb28dbf8a49/nmstate-handler/0.log" Mar 20 11:46:49 crc kubenswrapper[4695]: I0320 11:46:49.869428 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-dc646_935a2ea2-68f9-4295-ab4d-53eb5403edfc/kube-rbac-proxy/0.log" Mar 20 11:46:49 crc kubenswrapper[4695]: I0320 11:46:49.961812 4695 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-dc646_935a2ea2-68f9-4295-ab4d-53eb5403edfc/nmstate-metrics/0.log" Mar 20 11:46:50 crc kubenswrapper[4695]: I0320 11:46:50.091136 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-7xvf2_1fd45e0f-120d-45c8-99c5-f8fc5e043df6/nmstate-operator/0.log" Mar 20 11:46:50 crc kubenswrapper[4695]: I0320 11:46:50.164020 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-2wb9n_88d329fc-eca2-4b78-8aa7-af0444906ebf/nmstate-webhook/0.log" Mar 20 11:46:56 crc kubenswrapper[4695]: I0320 11:46:56.891779 4695 scope.go:117] "RemoveContainer" containerID="9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683" Mar 20 11:46:56 crc kubenswrapper[4695]: E0320 11:46:56.892816 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:47:09 crc kubenswrapper[4695]: I0320 11:47:09.887696 4695 scope.go:117] "RemoveContainer" containerID="9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683" Mar 20 11:47:09 crc kubenswrapper[4695]: E0320 11:47:09.895823 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:47:20 crc kubenswrapper[4695]: I0320 
11:47:20.259608 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-clkw8_e3f91838-e214-4456-880e-642525f5c721/kube-rbac-proxy/0.log" Mar 20 11:47:20 crc kubenswrapper[4695]: I0320 11:47:20.365592 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-clkw8_e3f91838-e214-4456-880e-642525f5c721/controller/0.log" Mar 20 11:47:20 crc kubenswrapper[4695]: I0320 11:47:20.564075 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkzsx_0fbeb433-d5f6-4984-a464-2783df4ccd70/cp-frr-files/0.log" Mar 20 11:47:20 crc kubenswrapper[4695]: I0320 11:47:20.745570 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkzsx_0fbeb433-d5f6-4984-a464-2783df4ccd70/cp-frr-files/0.log" Mar 20 11:47:20 crc kubenswrapper[4695]: I0320 11:47:20.746712 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkzsx_0fbeb433-d5f6-4984-a464-2783df4ccd70/cp-reloader/0.log" Mar 20 11:47:20 crc kubenswrapper[4695]: I0320 11:47:20.752390 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkzsx_0fbeb433-d5f6-4984-a464-2783df4ccd70/cp-metrics/0.log" Mar 20 11:47:20 crc kubenswrapper[4695]: I0320 11:47:20.823808 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkzsx_0fbeb433-d5f6-4984-a464-2783df4ccd70/cp-reloader/0.log" Mar 20 11:47:21 crc kubenswrapper[4695]: I0320 11:47:21.046691 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkzsx_0fbeb433-d5f6-4984-a464-2783df4ccd70/cp-frr-files/0.log" Mar 20 11:47:21 crc kubenswrapper[4695]: I0320 11:47:21.055511 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkzsx_0fbeb433-d5f6-4984-a464-2783df4ccd70/cp-metrics/0.log" Mar 20 11:47:21 crc kubenswrapper[4695]: I0320 11:47:21.055673 4695 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-kkzsx_0fbeb433-d5f6-4984-a464-2783df4ccd70/cp-metrics/0.log" Mar 20 11:47:21 crc kubenswrapper[4695]: I0320 11:47:21.089190 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkzsx_0fbeb433-d5f6-4984-a464-2783df4ccd70/cp-reloader/0.log" Mar 20 11:47:21 crc kubenswrapper[4695]: I0320 11:47:21.267542 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkzsx_0fbeb433-d5f6-4984-a464-2783df4ccd70/cp-frr-files/0.log" Mar 20 11:47:21 crc kubenswrapper[4695]: I0320 11:47:21.295408 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkzsx_0fbeb433-d5f6-4984-a464-2783df4ccd70/cp-metrics/0.log" Mar 20 11:47:21 crc kubenswrapper[4695]: I0320 11:47:21.310851 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkzsx_0fbeb433-d5f6-4984-a464-2783df4ccd70/controller/0.log" Mar 20 11:47:21 crc kubenswrapper[4695]: I0320 11:47:21.313741 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkzsx_0fbeb433-d5f6-4984-a464-2783df4ccd70/cp-reloader/0.log" Mar 20 11:47:21 crc kubenswrapper[4695]: I0320 11:47:21.844540 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkzsx_0fbeb433-d5f6-4984-a464-2783df4ccd70/frr-metrics/0.log" Mar 20 11:47:21 crc kubenswrapper[4695]: I0320 11:47:21.887800 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkzsx_0fbeb433-d5f6-4984-a464-2783df4ccd70/kube-rbac-proxy/0.log" Mar 20 11:47:21 crc kubenswrapper[4695]: I0320 11:47:21.920249 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkzsx_0fbeb433-d5f6-4984-a464-2783df4ccd70/kube-rbac-proxy-frr/0.log" Mar 20 11:47:22 crc kubenswrapper[4695]: I0320 11:47:22.062642 4695 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-kkzsx_0fbeb433-d5f6-4984-a464-2783df4ccd70/reloader/0.log" Mar 20 11:47:22 crc kubenswrapper[4695]: I0320 11:47:22.184312 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-gjhk4_bd87a6c8-0020-4bfd-b9fb-f75010d387e8/frr-k8s-webhook-server/0.log" Mar 20 11:47:22 crc kubenswrapper[4695]: I0320 11:47:22.186646 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-kkzsx_0fbeb433-d5f6-4984-a464-2783df4ccd70/frr/0.log" Mar 20 11:47:22 crc kubenswrapper[4695]: I0320 11:47:22.437369 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-774ff5745b-6kwgs_29bed4c6-fa24-4948-b267-ad2f5827d72f/webhook-server/0.log" Mar 20 11:47:22 crc kubenswrapper[4695]: I0320 11:47:22.448157 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-58888569cf-fq55h_d154e4d7-d533-4e4a-801a-b5db6b39b5f9/manager/0.log" Mar 20 11:47:22 crc kubenswrapper[4695]: I0320 11:47:22.629671 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-b88xw_2a5cc6f9-e338-4a31-a0ed-b29ed14383cf/kube-rbac-proxy/0.log" Mar 20 11:47:22 crc kubenswrapper[4695]: I0320 11:47:22.878038 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-b88xw_2a5cc6f9-e338-4a31-a0ed-b29ed14383cf/speaker/0.log" Mar 20 11:47:22 crc kubenswrapper[4695]: I0320 11:47:22.892771 4695 scope.go:117] "RemoveContainer" containerID="9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683" Mar 20 11:47:22 crc kubenswrapper[4695]: E0320 11:47:22.893071 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:47:34 crc kubenswrapper[4695]: I0320 11:47:34.888152 4695 scope.go:117] "RemoveContainer" containerID="9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683" Mar 20 11:47:34 crc kubenswrapper[4695]: E0320 11:47:34.889205 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:47:37 crc kubenswrapper[4695]: I0320 11:47:37.062993 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq_dc81d3d9-b38c-47dd-9429-21ea474dd393/util/0.log" Mar 20 11:47:37 crc kubenswrapper[4695]: I0320 11:47:37.263095 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq_dc81d3d9-b38c-47dd-9429-21ea474dd393/util/0.log" Mar 20 11:47:37 crc kubenswrapper[4695]: I0320 11:47:37.312067 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq_dc81d3d9-b38c-47dd-9429-21ea474dd393/pull/0.log" Mar 20 11:47:37 crc kubenswrapper[4695]: I0320 11:47:37.339130 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq_dc81d3d9-b38c-47dd-9429-21ea474dd393/pull/0.log" Mar 20 11:47:37 crc kubenswrapper[4695]: I0320 
11:47:37.488865 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq_dc81d3d9-b38c-47dd-9429-21ea474dd393/util/0.log" Mar 20 11:47:37 crc kubenswrapper[4695]: I0320 11:47:37.512843 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq_dc81d3d9-b38c-47dd-9429-21ea474dd393/pull/0.log" Mar 20 11:47:37 crc kubenswrapper[4695]: I0320 11:47:37.571137 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874ljrsq_dc81d3d9-b38c-47dd-9429-21ea474dd393/extract/0.log" Mar 20 11:47:37 crc kubenswrapper[4695]: I0320 11:47:37.724235 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8_ed0aecb5-8bed-4a2b-af31-67cb15eeee8d/util/0.log" Mar 20 11:47:37 crc kubenswrapper[4695]: I0320 11:47:37.889048 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8_ed0aecb5-8bed-4a2b-af31-67cb15eeee8d/util/0.log" Mar 20 11:47:37 crc kubenswrapper[4695]: I0320 11:47:37.897706 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8_ed0aecb5-8bed-4a2b-af31-67cb15eeee8d/pull/0.log" Mar 20 11:47:37 crc kubenswrapper[4695]: I0320 11:47:37.899203 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8_ed0aecb5-8bed-4a2b-af31-67cb15eeee8d/pull/0.log" Mar 20 11:47:38 crc kubenswrapper[4695]: I0320 11:47:38.406094 4695 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8_ed0aecb5-8bed-4a2b-af31-67cb15eeee8d/extract/0.log" Mar 20 11:47:38 crc kubenswrapper[4695]: I0320 11:47:38.417506 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8_ed0aecb5-8bed-4a2b-af31-67cb15eeee8d/util/0.log" Mar 20 11:47:38 crc kubenswrapper[4695]: I0320 11:47:38.420279 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xrrj8_ed0aecb5-8bed-4a2b-af31-67cb15eeee8d/pull/0.log" Mar 20 11:47:38 crc kubenswrapper[4695]: I0320 11:47:38.597139 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wd4q4_022dc941-0ef6-4062-9089-3b4fd1c2e404/extract-utilities/0.log" Mar 20 11:47:38 crc kubenswrapper[4695]: I0320 11:47:38.799096 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wd4q4_022dc941-0ef6-4062-9089-3b4fd1c2e404/extract-utilities/0.log" Mar 20 11:47:38 crc kubenswrapper[4695]: I0320 11:47:38.800956 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wd4q4_022dc941-0ef6-4062-9089-3b4fd1c2e404/extract-content/0.log" Mar 20 11:47:38 crc kubenswrapper[4695]: I0320 11:47:38.836072 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wd4q4_022dc941-0ef6-4062-9089-3b4fd1c2e404/extract-content/0.log" Mar 20 11:47:39 crc kubenswrapper[4695]: I0320 11:47:39.085786 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wd4q4_022dc941-0ef6-4062-9089-3b4fd1c2e404/extract-content/0.log" Mar 20 11:47:39 crc kubenswrapper[4695]: I0320 11:47:39.095541 4695 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-wd4q4_022dc941-0ef6-4062-9089-3b4fd1c2e404/extract-utilities/0.log" Mar 20 11:47:39 crc kubenswrapper[4695]: I0320 11:47:39.274354 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5kng9_7e3b7582-d5fb-4abc-91f1-68097a188855/extract-utilities/0.log" Mar 20 11:47:39 crc kubenswrapper[4695]: I0320 11:47:39.530747 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wd4q4_022dc941-0ef6-4062-9089-3b4fd1c2e404/registry-server/0.log" Mar 20 11:47:39 crc kubenswrapper[4695]: I0320 11:47:39.621270 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5kng9_7e3b7582-d5fb-4abc-91f1-68097a188855/extract-content/0.log" Mar 20 11:47:39 crc kubenswrapper[4695]: I0320 11:47:39.660069 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5kng9_7e3b7582-d5fb-4abc-91f1-68097a188855/extract-utilities/0.log" Mar 20 11:47:39 crc kubenswrapper[4695]: I0320 11:47:39.665637 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5kng9_7e3b7582-d5fb-4abc-91f1-68097a188855/extract-content/0.log" Mar 20 11:47:39 crc kubenswrapper[4695]: I0320 11:47:39.835829 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5kng9_7e3b7582-d5fb-4abc-91f1-68097a188855/extract-utilities/0.log" Mar 20 11:47:39 crc kubenswrapper[4695]: I0320 11:47:39.853184 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5kng9_7e3b7582-d5fb-4abc-91f1-68097a188855/extract-content/0.log" Mar 20 11:47:40 crc kubenswrapper[4695]: I0320 11:47:40.197236 4695 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-5vxrg_ec6419ea-4cf9-415f-8aba-c775cd497980/marketplace-operator/0.log" Mar 20 11:47:40 crc kubenswrapper[4695]: I0320 11:47:40.251896 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-slr2w_c0a73066-5eca-4060-b323-4f8bf64b4a6c/extract-utilities/0.log" Mar 20 11:47:40 crc kubenswrapper[4695]: I0320 11:47:40.283077 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-5kng9_7e3b7582-d5fb-4abc-91f1-68097a188855/registry-server/0.log" Mar 20 11:47:40 crc kubenswrapper[4695]: I0320 11:47:40.467885 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-slr2w_c0a73066-5eca-4060-b323-4f8bf64b4a6c/extract-utilities/0.log" Mar 20 11:47:40 crc kubenswrapper[4695]: I0320 11:47:40.484123 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-slr2w_c0a73066-5eca-4060-b323-4f8bf64b4a6c/extract-content/0.log" Mar 20 11:47:40 crc kubenswrapper[4695]: I0320 11:47:40.556553 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-slr2w_c0a73066-5eca-4060-b323-4f8bf64b4a6c/extract-content/0.log" Mar 20 11:47:40 crc kubenswrapper[4695]: I0320 11:47:40.687637 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-slr2w_c0a73066-5eca-4060-b323-4f8bf64b4a6c/extract-utilities/0.log" Mar 20 11:47:40 crc kubenswrapper[4695]: I0320 11:47:40.710605 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-slr2w_c0a73066-5eca-4060-b323-4f8bf64b4a6c/extract-content/0.log" Mar 20 11:47:40 crc kubenswrapper[4695]: I0320 11:47:40.851308 4695 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-slr2w_c0a73066-5eca-4060-b323-4f8bf64b4a6c/registry-server/0.log" Mar 20 11:47:40 crc kubenswrapper[4695]: I0320 11:47:40.927543 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-865c5_2b4f2e1f-35d1-4755-9e68-9a57f30f9423/extract-utilities/0.log" Mar 20 11:47:41 crc kubenswrapper[4695]: I0320 11:47:41.130625 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-865c5_2b4f2e1f-35d1-4755-9e68-9a57f30f9423/extract-content/0.log" Mar 20 11:47:41 crc kubenswrapper[4695]: I0320 11:47:41.151090 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-865c5_2b4f2e1f-35d1-4755-9e68-9a57f30f9423/extract-utilities/0.log" Mar 20 11:47:41 crc kubenswrapper[4695]: I0320 11:47:41.177141 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-865c5_2b4f2e1f-35d1-4755-9e68-9a57f30f9423/extract-content/0.log" Mar 20 11:47:41 crc kubenswrapper[4695]: I0320 11:47:41.382108 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-865c5_2b4f2e1f-35d1-4755-9e68-9a57f30f9423/extract-utilities/0.log" Mar 20 11:47:41 crc kubenswrapper[4695]: I0320 11:47:41.422702 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-865c5_2b4f2e1f-35d1-4755-9e68-9a57f30f9423/extract-content/0.log" Mar 20 11:47:41 crc kubenswrapper[4695]: I0320 11:47:41.963954 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-865c5_2b4f2e1f-35d1-4755-9e68-9a57f30f9423/registry-server/0.log" Mar 20 11:47:47 crc kubenswrapper[4695]: I0320 11:47:47.886797 4695 scope.go:117] "RemoveContainer" containerID="9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683" Mar 20 11:47:47 crc kubenswrapper[4695]: E0320 11:47:47.889559 
4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:47:58 crc kubenswrapper[4695]: I0320 11:47:58.888845 4695 scope.go:117] "RemoveContainer" containerID="9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683" Mar 20 11:47:58 crc kubenswrapper[4695]: E0320 11:47:58.889987 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:48:00 crc kubenswrapper[4695]: I0320 11:48:00.188513 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566788-tp8xm"] Mar 20 11:48:00 crc kubenswrapper[4695]: E0320 11:48:00.189445 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c827f5a-671e-4fbd-acac-bb6029906f10" containerName="oc" Mar 20 11:48:00 crc kubenswrapper[4695]: I0320 11:48:00.189464 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c827f5a-671e-4fbd-acac-bb6029906f10" containerName="oc" Mar 20 11:48:00 crc kubenswrapper[4695]: I0320 11:48:00.189786 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c827f5a-671e-4fbd-acac-bb6029906f10" containerName="oc" Mar 20 11:48:00 crc kubenswrapper[4695]: I0320 11:48:00.190885 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566788-tp8xm" Mar 20 11:48:00 crc kubenswrapper[4695]: I0320 11:48:00.194448 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:48:00 crc kubenswrapper[4695]: I0320 11:48:00.194705 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5kqds" Mar 20 11:48:00 crc kubenswrapper[4695]: I0320 11:48:00.194880 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:48:00 crc kubenswrapper[4695]: I0320 11:48:00.199607 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566788-tp8xm"] Mar 20 11:48:00 crc kubenswrapper[4695]: I0320 11:48:00.389611 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g87xr\" (UniqueName: \"kubernetes.io/projected/8da69ccf-5ad7-4cda-9c49-f949b714d1be-kube-api-access-g87xr\") pod \"auto-csr-approver-29566788-tp8xm\" (UID: \"8da69ccf-5ad7-4cda-9c49-f949b714d1be\") " pod="openshift-infra/auto-csr-approver-29566788-tp8xm" Mar 20 11:48:00 crc kubenswrapper[4695]: I0320 11:48:00.491265 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g87xr\" (UniqueName: \"kubernetes.io/projected/8da69ccf-5ad7-4cda-9c49-f949b714d1be-kube-api-access-g87xr\") pod \"auto-csr-approver-29566788-tp8xm\" (UID: \"8da69ccf-5ad7-4cda-9c49-f949b714d1be\") " pod="openshift-infra/auto-csr-approver-29566788-tp8xm" Mar 20 11:48:00 crc kubenswrapper[4695]: I0320 11:48:00.514490 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g87xr\" (UniqueName: \"kubernetes.io/projected/8da69ccf-5ad7-4cda-9c49-f949b714d1be-kube-api-access-g87xr\") pod \"auto-csr-approver-29566788-tp8xm\" (UID: \"8da69ccf-5ad7-4cda-9c49-f949b714d1be\") " 
pod="openshift-infra/auto-csr-approver-29566788-tp8xm" Mar 20 11:48:00 crc kubenswrapper[4695]: I0320 11:48:00.813455 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566788-tp8xm" Mar 20 11:48:01 crc kubenswrapper[4695]: I0320 11:48:01.294957 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566788-tp8xm"] Mar 20 11:48:02 crc kubenswrapper[4695]: I0320 11:48:02.175639 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566788-tp8xm" event={"ID":"8da69ccf-5ad7-4cda-9c49-f949b714d1be","Type":"ContainerStarted","Data":"395be29b637f7a8fd9176119fb460ce54fd8d8552a0f7f73ba3189c35737d772"} Mar 20 11:48:03 crc kubenswrapper[4695]: I0320 11:48:03.184166 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566788-tp8xm" event={"ID":"8da69ccf-5ad7-4cda-9c49-f949b714d1be","Type":"ContainerStarted","Data":"99a1063d052ca4a31c03aa7ed7fa88acc7ac8ffa3476fc810182dd33859cf54a"} Mar 20 11:48:04 crc kubenswrapper[4695]: I0320 11:48:04.192663 4695 generic.go:334] "Generic (PLEG): container finished" podID="8da69ccf-5ad7-4cda-9c49-f949b714d1be" containerID="99a1063d052ca4a31c03aa7ed7fa88acc7ac8ffa3476fc810182dd33859cf54a" exitCode=0 Mar 20 11:48:04 crc kubenswrapper[4695]: I0320 11:48:04.192815 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566788-tp8xm" event={"ID":"8da69ccf-5ad7-4cda-9c49-f949b714d1be","Type":"ContainerDied","Data":"99a1063d052ca4a31c03aa7ed7fa88acc7ac8ffa3476fc810182dd33859cf54a"} Mar 20 11:48:05 crc kubenswrapper[4695]: I0320 11:48:05.506299 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566788-tp8xm" Mar 20 11:48:05 crc kubenswrapper[4695]: I0320 11:48:05.677022 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g87xr\" (UniqueName: \"kubernetes.io/projected/8da69ccf-5ad7-4cda-9c49-f949b714d1be-kube-api-access-g87xr\") pod \"8da69ccf-5ad7-4cda-9c49-f949b714d1be\" (UID: \"8da69ccf-5ad7-4cda-9c49-f949b714d1be\") " Mar 20 11:48:05 crc kubenswrapper[4695]: I0320 11:48:05.699263 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8da69ccf-5ad7-4cda-9c49-f949b714d1be-kube-api-access-g87xr" (OuterVolumeSpecName: "kube-api-access-g87xr") pod "8da69ccf-5ad7-4cda-9c49-f949b714d1be" (UID: "8da69ccf-5ad7-4cda-9c49-f949b714d1be"). InnerVolumeSpecName "kube-api-access-g87xr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:48:05 crc kubenswrapper[4695]: I0320 11:48:05.779469 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g87xr\" (UniqueName: \"kubernetes.io/projected/8da69ccf-5ad7-4cda-9c49-f949b714d1be-kube-api-access-g87xr\") on node \"crc\" DevicePath \"\"" Mar 20 11:48:06 crc kubenswrapper[4695]: I0320 11:48:06.091971 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566782-jckgt"] Mar 20 11:48:06 crc kubenswrapper[4695]: I0320 11:48:06.143435 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566782-jckgt"] Mar 20 11:48:06 crc kubenswrapper[4695]: I0320 11:48:06.222880 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566788-tp8xm" event={"ID":"8da69ccf-5ad7-4cda-9c49-f949b714d1be","Type":"ContainerDied","Data":"395be29b637f7a8fd9176119fb460ce54fd8d8552a0f7f73ba3189c35737d772"} Mar 20 11:48:06 crc kubenswrapper[4695]: I0320 11:48:06.222950 4695 pod_container_deletor.go:80] "Container not found in pod's 
containers" containerID="395be29b637f7a8fd9176119fb460ce54fd8d8552a0f7f73ba3189c35737d772" Mar 20 11:48:06 crc kubenswrapper[4695]: I0320 11:48:06.223091 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566788-tp8xm" Mar 20 11:48:06 crc kubenswrapper[4695]: I0320 11:48:06.898385 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93b18780-f249-4ce5-928e-4ede3fde4360" path="/var/lib/kubelet/pods/93b18780-f249-4ce5-928e-4ede3fde4360/volumes" Mar 20 11:48:07 crc kubenswrapper[4695]: I0320 11:48:07.523841 4695 scope.go:117] "RemoveContainer" containerID="5ddb32697daba259741bc8d70ef818ef1c6f1db478d1eab954d023d041ff418b" Mar 20 11:48:12 crc kubenswrapper[4695]: I0320 11:48:12.891765 4695 scope.go:117] "RemoveContainer" containerID="9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683" Mar 20 11:48:12 crc kubenswrapper[4695]: E0320 11:48:12.892850 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:48:24 crc kubenswrapper[4695]: I0320 11:48:24.887034 4695 scope.go:117] "RemoveContainer" containerID="9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683" Mar 20 11:48:24 crc kubenswrapper[4695]: E0320 11:48:24.887962 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:48:35 crc kubenswrapper[4695]: I0320 11:48:35.887837 4695 scope.go:117] "RemoveContainer" containerID="9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683" Mar 20 11:48:35 crc kubenswrapper[4695]: E0320 11:48:35.888805 4695 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-bnwz5_openshift-machine-config-operator(7859c924-84d7-4855-901e-c77a02c56e3a)\"" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" Mar 20 11:48:46 crc kubenswrapper[4695]: I0320 11:48:46.222362 4695 scope.go:117] "RemoveContainer" containerID="9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683" Mar 20 11:48:47 crc kubenswrapper[4695]: I0320 11:48:47.615235 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" event={"ID":"7859c924-84d7-4855-901e-c77a02c56e3a","Type":"ContainerStarted","Data":"7a3ae36c199c203aaa8df730b6a5da70999a35dbcf7d29a69b9177c6281627a5"} Mar 20 11:49:05 crc kubenswrapper[4695]: I0320 11:49:05.758974 4695 generic.go:334] "Generic (PLEG): container finished" podID="e768c601-230c-4ce1-b2a0-d65f6de22091" containerID="2921c4a64e414d4fb0a1c8d35fee1340f32ebba7a6e9cea18cac19c62979e7d7" exitCode=0 Mar 20 11:49:05 crc kubenswrapper[4695]: I0320 11:49:05.759064 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fg4nh/must-gather-vlbwx" event={"ID":"e768c601-230c-4ce1-b2a0-d65f6de22091","Type":"ContainerDied","Data":"2921c4a64e414d4fb0a1c8d35fee1340f32ebba7a6e9cea18cac19c62979e7d7"} Mar 20 11:49:05 crc kubenswrapper[4695]: I0320 11:49:05.760166 4695 scope.go:117] "RemoveContainer" 
containerID="2921c4a64e414d4fb0a1c8d35fee1340f32ebba7a6e9cea18cac19c62979e7d7" Mar 20 11:49:06 crc kubenswrapper[4695]: I0320 11:49:06.538440 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fg4nh_must-gather-vlbwx_e768c601-230c-4ce1-b2a0-d65f6de22091/gather/0.log" Mar 20 11:49:13 crc kubenswrapper[4695]: I0320 11:49:13.489436 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fg4nh/must-gather-vlbwx"] Mar 20 11:49:13 crc kubenswrapper[4695]: I0320 11:49:13.490321 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-fg4nh/must-gather-vlbwx" podUID="e768c601-230c-4ce1-b2a0-d65f6de22091" containerName="copy" containerID="cri-o://297e81a06c4a6de58ae0902e4ee54b7011de9bcaff688007c6ddf9930c7e7390" gracePeriod=2 Mar 20 11:49:13 crc kubenswrapper[4695]: I0320 11:49:13.496743 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fg4nh/must-gather-vlbwx"] Mar 20 11:49:13 crc kubenswrapper[4695]: I0320 11:49:13.835299 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fg4nh_must-gather-vlbwx_e768c601-230c-4ce1-b2a0-d65f6de22091/copy/0.log" Mar 20 11:49:13 crc kubenswrapper[4695]: I0320 11:49:13.838214 4695 generic.go:334] "Generic (PLEG): container finished" podID="e768c601-230c-4ce1-b2a0-d65f6de22091" containerID="297e81a06c4a6de58ae0902e4ee54b7011de9bcaff688007c6ddf9930c7e7390" exitCode=143 Mar 20 11:49:13 crc kubenswrapper[4695]: I0320 11:49:13.972686 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fg4nh_must-gather-vlbwx_e768c601-230c-4ce1-b2a0-d65f6de22091/copy/0.log" Mar 20 11:49:13 crc kubenswrapper[4695]: I0320 11:49:13.973509 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fg4nh/must-gather-vlbwx" Mar 20 11:49:14 crc kubenswrapper[4695]: I0320 11:49:14.131043 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e768c601-230c-4ce1-b2a0-d65f6de22091-must-gather-output\") pod \"e768c601-230c-4ce1-b2a0-d65f6de22091\" (UID: \"e768c601-230c-4ce1-b2a0-d65f6de22091\") " Mar 20 11:49:14 crc kubenswrapper[4695]: I0320 11:49:14.131163 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbt2h\" (UniqueName: \"kubernetes.io/projected/e768c601-230c-4ce1-b2a0-d65f6de22091-kube-api-access-mbt2h\") pod \"e768c601-230c-4ce1-b2a0-d65f6de22091\" (UID: \"e768c601-230c-4ce1-b2a0-d65f6de22091\") " Mar 20 11:49:14 crc kubenswrapper[4695]: I0320 11:49:14.137636 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e768c601-230c-4ce1-b2a0-d65f6de22091-kube-api-access-mbt2h" (OuterVolumeSpecName: "kube-api-access-mbt2h") pod "e768c601-230c-4ce1-b2a0-d65f6de22091" (UID: "e768c601-230c-4ce1-b2a0-d65f6de22091"). InnerVolumeSpecName "kube-api-access-mbt2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:49:14 crc kubenswrapper[4695]: I0320 11:49:14.234295 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbt2h\" (UniqueName: \"kubernetes.io/projected/e768c601-230c-4ce1-b2a0-d65f6de22091-kube-api-access-mbt2h\") on node \"crc\" DevicePath \"\"" Mar 20 11:49:14 crc kubenswrapper[4695]: I0320 11:49:14.237270 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e768c601-230c-4ce1-b2a0-d65f6de22091-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "e768c601-230c-4ce1-b2a0-d65f6de22091" (UID: "e768c601-230c-4ce1-b2a0-d65f6de22091"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:49:14 crc kubenswrapper[4695]: I0320 11:49:14.335265 4695 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e768c601-230c-4ce1-b2a0-d65f6de22091-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 20 11:49:14 crc kubenswrapper[4695]: I0320 11:49:14.850214 4695 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fg4nh_must-gather-vlbwx_e768c601-230c-4ce1-b2a0-d65f6de22091/copy/0.log" Mar 20 11:49:14 crc kubenswrapper[4695]: I0320 11:49:14.851003 4695 scope.go:117] "RemoveContainer" containerID="297e81a06c4a6de58ae0902e4ee54b7011de9bcaff688007c6ddf9930c7e7390" Mar 20 11:49:14 crc kubenswrapper[4695]: I0320 11:49:14.851127 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fg4nh/must-gather-vlbwx" Mar 20 11:49:14 crc kubenswrapper[4695]: I0320 11:49:14.882203 4695 scope.go:117] "RemoveContainer" containerID="2921c4a64e414d4fb0a1c8d35fee1340f32ebba7a6e9cea18cac19c62979e7d7" Mar 20 11:49:15 crc kubenswrapper[4695]: I0320 11:49:15.074546 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e768c601-230c-4ce1-b2a0-d65f6de22091" path="/var/lib/kubelet/pods/e768c601-230c-4ce1-b2a0-d65f6de22091/volumes" Mar 20 11:50:00 crc kubenswrapper[4695]: I0320 11:50:00.159572 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566790-5tfd6"] Mar 20 11:50:00 crc kubenswrapper[4695]: E0320 11:50:00.161996 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e768c601-230c-4ce1-b2a0-d65f6de22091" containerName="copy" Mar 20 11:50:00 crc kubenswrapper[4695]: I0320 11:50:00.162089 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="e768c601-230c-4ce1-b2a0-d65f6de22091" containerName="copy" Mar 20 11:50:00 crc kubenswrapper[4695]: E0320 11:50:00.162175 4695 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e768c601-230c-4ce1-b2a0-d65f6de22091" containerName="gather" Mar 20 11:50:00 crc kubenswrapper[4695]: I0320 11:50:00.162248 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="e768c601-230c-4ce1-b2a0-d65f6de22091" containerName="gather" Mar 20 11:50:00 crc kubenswrapper[4695]: E0320 11:50:00.162313 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8da69ccf-5ad7-4cda-9c49-f949b714d1be" containerName="oc" Mar 20 11:50:00 crc kubenswrapper[4695]: I0320 11:50:00.162378 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="8da69ccf-5ad7-4cda-9c49-f949b714d1be" containerName="oc" Mar 20 11:50:00 crc kubenswrapper[4695]: I0320 11:50:00.162591 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="8da69ccf-5ad7-4cda-9c49-f949b714d1be" containerName="oc" Mar 20 11:50:00 crc kubenswrapper[4695]: I0320 11:50:00.162661 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="e768c601-230c-4ce1-b2a0-d65f6de22091" containerName="copy" Mar 20 11:50:00 crc kubenswrapper[4695]: I0320 11:50:00.163273 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="e768c601-230c-4ce1-b2a0-d65f6de22091" containerName="gather" Mar 20 11:50:00 crc kubenswrapper[4695]: I0320 11:50:00.164030 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566790-5tfd6" Mar 20 11:50:00 crc kubenswrapper[4695]: I0320 11:50:00.167593 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:50:00 crc kubenswrapper[4695]: I0320 11:50:00.167836 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5kqds" Mar 20 11:50:00 crc kubenswrapper[4695]: I0320 11:50:00.172953 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:50:00 crc kubenswrapper[4695]: I0320 11:50:00.175343 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566790-5tfd6"] Mar 20 11:50:00 crc kubenswrapper[4695]: I0320 11:50:00.359066 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm45b\" (UniqueName: \"kubernetes.io/projected/0f2923e7-5697-4fc2-af21-f9015330dccf-kube-api-access-gm45b\") pod \"auto-csr-approver-29566790-5tfd6\" (UID: \"0f2923e7-5697-4fc2-af21-f9015330dccf\") " pod="openshift-infra/auto-csr-approver-29566790-5tfd6" Mar 20 11:50:00 crc kubenswrapper[4695]: I0320 11:50:00.460489 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm45b\" (UniqueName: \"kubernetes.io/projected/0f2923e7-5697-4fc2-af21-f9015330dccf-kube-api-access-gm45b\") pod \"auto-csr-approver-29566790-5tfd6\" (UID: \"0f2923e7-5697-4fc2-af21-f9015330dccf\") " pod="openshift-infra/auto-csr-approver-29566790-5tfd6" Mar 20 11:50:00 crc kubenswrapper[4695]: I0320 11:50:00.484827 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm45b\" (UniqueName: \"kubernetes.io/projected/0f2923e7-5697-4fc2-af21-f9015330dccf-kube-api-access-gm45b\") pod \"auto-csr-approver-29566790-5tfd6\" (UID: \"0f2923e7-5697-4fc2-af21-f9015330dccf\") " 
pod="openshift-infra/auto-csr-approver-29566790-5tfd6" Mar 20 11:50:00 crc kubenswrapper[4695]: I0320 11:50:00.495754 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566790-5tfd6" Mar 20 11:50:00 crc kubenswrapper[4695]: I0320 11:50:00.984178 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566790-5tfd6"] Mar 20 11:50:00 crc kubenswrapper[4695]: I0320 11:50:00.989243 4695 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:50:01 crc kubenswrapper[4695]: I0320 11:50:01.890011 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566790-5tfd6" event={"ID":"0f2923e7-5697-4fc2-af21-f9015330dccf","Type":"ContainerStarted","Data":"9e0bd43bc5f11464888bb9be85a17f277e02122a3762f70736118c4d7615622c"} Mar 20 11:50:02 crc kubenswrapper[4695]: I0320 11:50:02.900750 4695 generic.go:334] "Generic (PLEG): container finished" podID="0f2923e7-5697-4fc2-af21-f9015330dccf" containerID="72d9e1717639feccd28df2c1c8717955130c0b7772f79e3cad3a860d0566776e" exitCode=0 Mar 20 11:50:02 crc kubenswrapper[4695]: I0320 11:50:02.900769 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566790-5tfd6" event={"ID":"0f2923e7-5697-4fc2-af21-f9015330dccf","Type":"ContainerDied","Data":"72d9e1717639feccd28df2c1c8717955130c0b7772f79e3cad3a860d0566776e"} Mar 20 11:50:04 crc kubenswrapper[4695]: I0320 11:50:04.201114 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566790-5tfd6" Mar 20 11:50:04 crc kubenswrapper[4695]: I0320 11:50:04.312159 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm45b\" (UniqueName: \"kubernetes.io/projected/0f2923e7-5697-4fc2-af21-f9015330dccf-kube-api-access-gm45b\") pod \"0f2923e7-5697-4fc2-af21-f9015330dccf\" (UID: \"0f2923e7-5697-4fc2-af21-f9015330dccf\") " Mar 20 11:50:04 crc kubenswrapper[4695]: I0320 11:50:04.328335 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f2923e7-5697-4fc2-af21-f9015330dccf-kube-api-access-gm45b" (OuterVolumeSpecName: "kube-api-access-gm45b") pod "0f2923e7-5697-4fc2-af21-f9015330dccf" (UID: "0f2923e7-5697-4fc2-af21-f9015330dccf"). InnerVolumeSpecName "kube-api-access-gm45b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:50:04 crc kubenswrapper[4695]: I0320 11:50:04.413813 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm45b\" (UniqueName: \"kubernetes.io/projected/0f2923e7-5697-4fc2-af21-f9015330dccf-kube-api-access-gm45b\") on node \"crc\" DevicePath \"\"" Mar 20 11:50:04 crc kubenswrapper[4695]: I0320 11:50:04.919580 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566790-5tfd6" event={"ID":"0f2923e7-5697-4fc2-af21-f9015330dccf","Type":"ContainerDied","Data":"9e0bd43bc5f11464888bb9be85a17f277e02122a3762f70736118c4d7615622c"} Mar 20 11:50:04 crc kubenswrapper[4695]: I0320 11:50:04.919642 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e0bd43bc5f11464888bb9be85a17f277e02122a3762f70736118c4d7615622c" Mar 20 11:50:04 crc kubenswrapper[4695]: I0320 11:50:04.919643 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566790-5tfd6" Mar 20 11:50:05 crc kubenswrapper[4695]: I0320 11:50:05.288098 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566784-nsc7l"] Mar 20 11:50:05 crc kubenswrapper[4695]: I0320 11:50:05.294290 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566784-nsc7l"] Mar 20 11:50:06 crc kubenswrapper[4695]: I0320 11:50:06.897612 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8a0bbf6-76f8-425b-bd9a-d834207d3ac0" path="/var/lib/kubelet/pods/a8a0bbf6-76f8-425b-bd9a-d834207d3ac0/volumes" Mar 20 11:50:07 crc kubenswrapper[4695]: I0320 11:50:07.632710 4695 scope.go:117] "RemoveContainer" containerID="e6bcfd88e3748f635146c21ab6125c935129598a21c6023646137cad9034dd07" Mar 20 11:50:11 crc kubenswrapper[4695]: I0320 11:50:11.231331 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vlmrq"] Mar 20 11:50:11 crc kubenswrapper[4695]: E0320 11:50:11.232799 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f2923e7-5697-4fc2-af21-f9015330dccf" containerName="oc" Mar 20 11:50:11 crc kubenswrapper[4695]: I0320 11:50:11.232820 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f2923e7-5697-4fc2-af21-f9015330dccf" containerName="oc" Mar 20 11:50:11 crc kubenswrapper[4695]: I0320 11:50:11.232999 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f2923e7-5697-4fc2-af21-f9015330dccf" containerName="oc" Mar 20 11:50:11 crc kubenswrapper[4695]: I0320 11:50:11.234377 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vlmrq" Mar 20 11:50:11 crc kubenswrapper[4695]: I0320 11:50:11.244008 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vlmrq"] Mar 20 11:50:11 crc kubenswrapper[4695]: I0320 11:50:11.327010 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djldc\" (UniqueName: \"kubernetes.io/projected/280673d1-e13a-40e9-a52e-62d401c38855-kube-api-access-djldc\") pod \"certified-operators-vlmrq\" (UID: \"280673d1-e13a-40e9-a52e-62d401c38855\") " pod="openshift-marketplace/certified-operators-vlmrq" Mar 20 11:50:11 crc kubenswrapper[4695]: I0320 11:50:11.327077 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/280673d1-e13a-40e9-a52e-62d401c38855-utilities\") pod \"certified-operators-vlmrq\" (UID: \"280673d1-e13a-40e9-a52e-62d401c38855\") " pod="openshift-marketplace/certified-operators-vlmrq" Mar 20 11:50:11 crc kubenswrapper[4695]: I0320 11:50:11.327144 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/280673d1-e13a-40e9-a52e-62d401c38855-catalog-content\") pod \"certified-operators-vlmrq\" (UID: \"280673d1-e13a-40e9-a52e-62d401c38855\") " pod="openshift-marketplace/certified-operators-vlmrq" Mar 20 11:50:11 crc kubenswrapper[4695]: I0320 11:50:11.429233 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/280673d1-e13a-40e9-a52e-62d401c38855-utilities\") pod \"certified-operators-vlmrq\" (UID: \"280673d1-e13a-40e9-a52e-62d401c38855\") " pod="openshift-marketplace/certified-operators-vlmrq" Mar 20 11:50:11 crc kubenswrapper[4695]: I0320 11:50:11.429330 4695 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/280673d1-e13a-40e9-a52e-62d401c38855-catalog-content\") pod \"certified-operators-vlmrq\" (UID: \"280673d1-e13a-40e9-a52e-62d401c38855\") " pod="openshift-marketplace/certified-operators-vlmrq" Mar 20 11:50:11 crc kubenswrapper[4695]: I0320 11:50:11.429479 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djldc\" (UniqueName: \"kubernetes.io/projected/280673d1-e13a-40e9-a52e-62d401c38855-kube-api-access-djldc\") pod \"certified-operators-vlmrq\" (UID: \"280673d1-e13a-40e9-a52e-62d401c38855\") " pod="openshift-marketplace/certified-operators-vlmrq" Mar 20 11:50:11 crc kubenswrapper[4695]: I0320 11:50:11.429979 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/280673d1-e13a-40e9-a52e-62d401c38855-utilities\") pod \"certified-operators-vlmrq\" (UID: \"280673d1-e13a-40e9-a52e-62d401c38855\") " pod="openshift-marketplace/certified-operators-vlmrq" Mar 20 11:50:11 crc kubenswrapper[4695]: I0320 11:50:11.430053 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/280673d1-e13a-40e9-a52e-62d401c38855-catalog-content\") pod \"certified-operators-vlmrq\" (UID: \"280673d1-e13a-40e9-a52e-62d401c38855\") " pod="openshift-marketplace/certified-operators-vlmrq" Mar 20 11:50:11 crc kubenswrapper[4695]: I0320 11:50:11.454105 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djldc\" (UniqueName: \"kubernetes.io/projected/280673d1-e13a-40e9-a52e-62d401c38855-kube-api-access-djldc\") pod \"certified-operators-vlmrq\" (UID: \"280673d1-e13a-40e9-a52e-62d401c38855\") " pod="openshift-marketplace/certified-operators-vlmrq" Mar 20 11:50:11 crc kubenswrapper[4695]: I0320 11:50:11.560793 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vlmrq" Mar 20 11:50:11 crc kubenswrapper[4695]: W0320 11:50:11.883092 4695 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod280673d1_e13a_40e9_a52e_62d401c38855.slice/crio-dde36b2c5a8d78b4ddb791d394bd140ccda2d310cb68205c83df679a51ff624b WatchSource:0}: Error finding container dde36b2c5a8d78b4ddb791d394bd140ccda2d310cb68205c83df679a51ff624b: Status 404 returned error can't find the container with id dde36b2c5a8d78b4ddb791d394bd140ccda2d310cb68205c83df679a51ff624b Mar 20 11:50:11 crc kubenswrapper[4695]: I0320 11:50:11.897282 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vlmrq"] Mar 20 11:50:11 crc kubenswrapper[4695]: I0320 11:50:11.984003 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vlmrq" event={"ID":"280673d1-e13a-40e9-a52e-62d401c38855","Type":"ContainerStarted","Data":"dde36b2c5a8d78b4ddb791d394bd140ccda2d310cb68205c83df679a51ff624b"} Mar 20 11:50:13 crc kubenswrapper[4695]: I0320 11:50:13.009223 4695 generic.go:334] "Generic (PLEG): container finished" podID="280673d1-e13a-40e9-a52e-62d401c38855" containerID="38510cdbcc8318bed6f4176c3ce932bb9ab4363f43430744324b495b52b7b477" exitCode=0 Mar 20 11:50:13 crc kubenswrapper[4695]: I0320 11:50:13.009358 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vlmrq" event={"ID":"280673d1-e13a-40e9-a52e-62d401c38855","Type":"ContainerDied","Data":"38510cdbcc8318bed6f4176c3ce932bb9ab4363f43430744324b495b52b7b477"} Mar 20 11:50:14 crc kubenswrapper[4695]: I0320 11:50:14.021069 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vlmrq" 
event={"ID":"280673d1-e13a-40e9-a52e-62d401c38855","Type":"ContainerStarted","Data":"cb52a22a64ebd6f59023082a0253fed1d18e0a32f632944ac1537d3ad83fdd2a"} Mar 20 11:50:15 crc kubenswrapper[4695]: I0320 11:50:15.033039 4695 generic.go:334] "Generic (PLEG): container finished" podID="280673d1-e13a-40e9-a52e-62d401c38855" containerID="cb52a22a64ebd6f59023082a0253fed1d18e0a32f632944ac1537d3ad83fdd2a" exitCode=0 Mar 20 11:50:15 crc kubenswrapper[4695]: I0320 11:50:15.033138 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vlmrq" event={"ID":"280673d1-e13a-40e9-a52e-62d401c38855","Type":"ContainerDied","Data":"cb52a22a64ebd6f59023082a0253fed1d18e0a32f632944ac1537d3ad83fdd2a"} Mar 20 11:50:15 crc kubenswrapper[4695]: I0320 11:50:15.472301 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gmrqb"] Mar 20 11:50:15 crc kubenswrapper[4695]: I0320 11:50:15.474497 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gmrqb" Mar 20 11:50:15 crc kubenswrapper[4695]: I0320 11:50:15.500102 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gmrqb"] Mar 20 11:50:15 crc kubenswrapper[4695]: I0320 11:50:15.602201 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/118d1f4c-12a9-492f-a297-28b139b9bb35-catalog-content\") pod \"redhat-operators-gmrqb\" (UID: \"118d1f4c-12a9-492f-a297-28b139b9bb35\") " pod="openshift-marketplace/redhat-operators-gmrqb" Mar 20 11:50:15 crc kubenswrapper[4695]: I0320 11:50:15.602339 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/118d1f4c-12a9-492f-a297-28b139b9bb35-utilities\") pod \"redhat-operators-gmrqb\" (UID: \"118d1f4c-12a9-492f-a297-28b139b9bb35\") " pod="openshift-marketplace/redhat-operators-gmrqb" Mar 20 11:50:15 crc kubenswrapper[4695]: I0320 11:50:15.602426 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q7dc\" (UniqueName: \"kubernetes.io/projected/118d1f4c-12a9-492f-a297-28b139b9bb35-kube-api-access-7q7dc\") pod \"redhat-operators-gmrqb\" (UID: \"118d1f4c-12a9-492f-a297-28b139b9bb35\") " pod="openshift-marketplace/redhat-operators-gmrqb" Mar 20 11:50:15 crc kubenswrapper[4695]: I0320 11:50:15.704623 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/118d1f4c-12a9-492f-a297-28b139b9bb35-catalog-content\") pod \"redhat-operators-gmrqb\" (UID: \"118d1f4c-12a9-492f-a297-28b139b9bb35\") " pod="openshift-marketplace/redhat-operators-gmrqb" Mar 20 11:50:15 crc kubenswrapper[4695]: I0320 11:50:15.703982 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/118d1f4c-12a9-492f-a297-28b139b9bb35-catalog-content\") pod \"redhat-operators-gmrqb\" (UID: \"118d1f4c-12a9-492f-a297-28b139b9bb35\") " pod="openshift-marketplace/redhat-operators-gmrqb" Mar 20 11:50:15 crc kubenswrapper[4695]: I0320 11:50:15.705164 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/118d1f4c-12a9-492f-a297-28b139b9bb35-utilities\") pod \"redhat-operators-gmrqb\" (UID: \"118d1f4c-12a9-492f-a297-28b139b9bb35\") " pod="openshift-marketplace/redhat-operators-gmrqb" Mar 20 11:50:15 crc kubenswrapper[4695]: I0320 11:50:15.705301 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/118d1f4c-12a9-492f-a297-28b139b9bb35-utilities\") pod \"redhat-operators-gmrqb\" (UID: \"118d1f4c-12a9-492f-a297-28b139b9bb35\") " pod="openshift-marketplace/redhat-operators-gmrqb" Mar 20 11:50:15 crc kubenswrapper[4695]: I0320 11:50:15.705405 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q7dc\" (UniqueName: \"kubernetes.io/projected/118d1f4c-12a9-492f-a297-28b139b9bb35-kube-api-access-7q7dc\") pod \"redhat-operators-gmrqb\" (UID: \"118d1f4c-12a9-492f-a297-28b139b9bb35\") " pod="openshift-marketplace/redhat-operators-gmrqb" Mar 20 11:50:15 crc kubenswrapper[4695]: I0320 11:50:15.727797 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q7dc\" (UniqueName: \"kubernetes.io/projected/118d1f4c-12a9-492f-a297-28b139b9bb35-kube-api-access-7q7dc\") pod \"redhat-operators-gmrqb\" (UID: \"118d1f4c-12a9-492f-a297-28b139b9bb35\") " pod="openshift-marketplace/redhat-operators-gmrqb" Mar 20 11:50:15 crc kubenswrapper[4695]: I0320 11:50:15.870394 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gmrqb" Mar 20 11:50:16 crc kubenswrapper[4695]: I0320 11:50:16.076387 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vlmrq" event={"ID":"280673d1-e13a-40e9-a52e-62d401c38855","Type":"ContainerStarted","Data":"cb5ecdfbf96f15ac3bb007ebb0f598469a8284f24f05492492bab438d93be6db"} Mar 20 11:50:16 crc kubenswrapper[4695]: I0320 11:50:16.110953 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vlmrq" podStartSLOduration=2.644990085 podStartE2EDuration="5.110873145s" podCreationTimestamp="2026-03-20 11:50:11 +0000 UTC" firstStartedPulling="2026-03-20 11:50:13.011297719 +0000 UTC m=+3390.791903282" lastFinishedPulling="2026-03-20 11:50:15.477180779 +0000 UTC m=+3393.257786342" observedRunningTime="2026-03-20 11:50:16.10989116 +0000 UTC m=+3393.890496753" watchObservedRunningTime="2026-03-20 11:50:16.110873145 +0000 UTC m=+3393.891478708" Mar 20 11:50:16 crc kubenswrapper[4695]: I0320 11:50:16.446561 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gmrqb"] Mar 20 11:50:17 crc kubenswrapper[4695]: I0320 11:50:17.086085 4695 generic.go:334] "Generic (PLEG): container finished" podID="118d1f4c-12a9-492f-a297-28b139b9bb35" containerID="e029c02eee990bbf7249ea282a61c05306810f64635c41ac64eb550b1e4c1384" exitCode=0 Mar 20 11:50:17 crc kubenswrapper[4695]: I0320 11:50:17.086225 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gmrqb" event={"ID":"118d1f4c-12a9-492f-a297-28b139b9bb35","Type":"ContainerDied","Data":"e029c02eee990bbf7249ea282a61c05306810f64635c41ac64eb550b1e4c1384"} Mar 20 11:50:17 crc kubenswrapper[4695]: I0320 11:50:17.086671 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gmrqb" 
event={"ID":"118d1f4c-12a9-492f-a297-28b139b9bb35","Type":"ContainerStarted","Data":"807a211dee4ead410f3129b723371d194ede023942eac706dba6f7885d64347f"} Mar 20 11:50:18 crc kubenswrapper[4695]: I0320 11:50:18.099137 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gmrqb" event={"ID":"118d1f4c-12a9-492f-a297-28b139b9bb35","Type":"ContainerStarted","Data":"1abcb4043fff9e476f32b545c6c902f8f3d5755310176521e601b1f6c25a928f"} Mar 20 11:50:19 crc kubenswrapper[4695]: I0320 11:50:19.113874 4695 generic.go:334] "Generic (PLEG): container finished" podID="118d1f4c-12a9-492f-a297-28b139b9bb35" containerID="1abcb4043fff9e476f32b545c6c902f8f3d5755310176521e601b1f6c25a928f" exitCode=0 Mar 20 11:50:19 crc kubenswrapper[4695]: I0320 11:50:19.113974 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gmrqb" event={"ID":"118d1f4c-12a9-492f-a297-28b139b9bb35","Type":"ContainerDied","Data":"1abcb4043fff9e476f32b545c6c902f8f3d5755310176521e601b1f6c25a928f"} Mar 20 11:50:20 crc kubenswrapper[4695]: I0320 11:50:20.124630 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gmrqb" event={"ID":"118d1f4c-12a9-492f-a297-28b139b9bb35","Type":"ContainerStarted","Data":"42155690e38958354fa0cb4d0dca49c0c54570b7427819f3e4b795d7b5ab4d4f"} Mar 20 11:50:20 crc kubenswrapper[4695]: I0320 11:50:20.158198 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gmrqb" podStartSLOduration=2.718954733 podStartE2EDuration="5.158150124s" podCreationTimestamp="2026-03-20 11:50:15 +0000 UTC" firstStartedPulling="2026-03-20 11:50:17.088592902 +0000 UTC m=+3394.869198465" lastFinishedPulling="2026-03-20 11:50:19.527788293 +0000 UTC m=+3397.308393856" observedRunningTime="2026-03-20 11:50:20.147149004 +0000 UTC m=+3397.927754567" watchObservedRunningTime="2026-03-20 11:50:20.158150124 +0000 UTC m=+3397.938755687" 
Mar 20 11:50:21 crc kubenswrapper[4695]: I0320 11:50:21.560994 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vlmrq" Mar 20 11:50:21 crc kubenswrapper[4695]: I0320 11:50:21.561495 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vlmrq" Mar 20 11:50:21 crc kubenswrapper[4695]: I0320 11:50:21.628716 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vlmrq" Mar 20 11:50:22 crc kubenswrapper[4695]: I0320 11:50:22.184363 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vlmrq" Mar 20 11:50:22 crc kubenswrapper[4695]: I0320 11:50:22.635745 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vlmrq"] Mar 20 11:50:24 crc kubenswrapper[4695]: I0320 11:50:24.154180 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vlmrq" podUID="280673d1-e13a-40e9-a52e-62d401c38855" containerName="registry-server" containerID="cri-o://cb5ecdfbf96f15ac3bb007ebb0f598469a8284f24f05492492bab438d93be6db" gracePeriod=2 Mar 20 11:50:25 crc kubenswrapper[4695]: I0320 11:50:25.175737 4695 generic.go:334] "Generic (PLEG): container finished" podID="280673d1-e13a-40e9-a52e-62d401c38855" containerID="cb5ecdfbf96f15ac3bb007ebb0f598469a8284f24f05492492bab438d93be6db" exitCode=0 Mar 20 11:50:25 crc kubenswrapper[4695]: I0320 11:50:25.176115 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vlmrq" event={"ID":"280673d1-e13a-40e9-a52e-62d401c38855","Type":"ContainerDied","Data":"cb5ecdfbf96f15ac3bb007ebb0f598469a8284f24f05492492bab438d93be6db"} Mar 20 11:50:25 crc kubenswrapper[4695]: I0320 11:50:25.309926 4695 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/certified-operators-vlmrq" Mar 20 11:50:25 crc kubenswrapper[4695]: I0320 11:50:25.457464 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djldc\" (UniqueName: \"kubernetes.io/projected/280673d1-e13a-40e9-a52e-62d401c38855-kube-api-access-djldc\") pod \"280673d1-e13a-40e9-a52e-62d401c38855\" (UID: \"280673d1-e13a-40e9-a52e-62d401c38855\") " Mar 20 11:50:25 crc kubenswrapper[4695]: I0320 11:50:25.457524 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/280673d1-e13a-40e9-a52e-62d401c38855-utilities\") pod \"280673d1-e13a-40e9-a52e-62d401c38855\" (UID: \"280673d1-e13a-40e9-a52e-62d401c38855\") " Mar 20 11:50:25 crc kubenswrapper[4695]: I0320 11:50:25.457594 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/280673d1-e13a-40e9-a52e-62d401c38855-catalog-content\") pod \"280673d1-e13a-40e9-a52e-62d401c38855\" (UID: \"280673d1-e13a-40e9-a52e-62d401c38855\") " Mar 20 11:50:25 crc kubenswrapper[4695]: I0320 11:50:25.458895 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/280673d1-e13a-40e9-a52e-62d401c38855-utilities" (OuterVolumeSpecName: "utilities") pod "280673d1-e13a-40e9-a52e-62d401c38855" (UID: "280673d1-e13a-40e9-a52e-62d401c38855"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:50:25 crc kubenswrapper[4695]: I0320 11:50:25.465896 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/280673d1-e13a-40e9-a52e-62d401c38855-kube-api-access-djldc" (OuterVolumeSpecName: "kube-api-access-djldc") pod "280673d1-e13a-40e9-a52e-62d401c38855" (UID: "280673d1-e13a-40e9-a52e-62d401c38855"). InnerVolumeSpecName "kube-api-access-djldc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:50:25 crc kubenswrapper[4695]: I0320 11:50:25.515326 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/280673d1-e13a-40e9-a52e-62d401c38855-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "280673d1-e13a-40e9-a52e-62d401c38855" (UID: "280673d1-e13a-40e9-a52e-62d401c38855"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:50:25 crc kubenswrapper[4695]: I0320 11:50:25.559618 4695 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/280673d1-e13a-40e9-a52e-62d401c38855-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:50:25 crc kubenswrapper[4695]: I0320 11:50:25.559669 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djldc\" (UniqueName: \"kubernetes.io/projected/280673d1-e13a-40e9-a52e-62d401c38855-kube-api-access-djldc\") on node \"crc\" DevicePath \"\"" Mar 20 11:50:25 crc kubenswrapper[4695]: I0320 11:50:25.559685 4695 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/280673d1-e13a-40e9-a52e-62d401c38855-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:50:25 crc kubenswrapper[4695]: I0320 11:50:25.872029 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gmrqb" Mar 20 11:50:25 crc kubenswrapper[4695]: I0320 11:50:25.872174 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gmrqb" Mar 20 11:50:26 crc kubenswrapper[4695]: I0320 11:50:26.187145 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vlmrq" 
event={"ID":"280673d1-e13a-40e9-a52e-62d401c38855","Type":"ContainerDied","Data":"dde36b2c5a8d78b4ddb791d394bd140ccda2d310cb68205c83df679a51ff624b"} Mar 20 11:50:26 crc kubenswrapper[4695]: I0320 11:50:26.187583 4695 scope.go:117] "RemoveContainer" containerID="cb5ecdfbf96f15ac3bb007ebb0f598469a8284f24f05492492bab438d93be6db" Mar 20 11:50:26 crc kubenswrapper[4695]: I0320 11:50:26.187728 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vlmrq" Mar 20 11:50:26 crc kubenswrapper[4695]: I0320 11:50:26.213515 4695 scope.go:117] "RemoveContainer" containerID="cb52a22a64ebd6f59023082a0253fed1d18e0a32f632944ac1537d3ad83fdd2a" Mar 20 11:50:26 crc kubenswrapper[4695]: I0320 11:50:26.217026 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vlmrq"] Mar 20 11:50:26 crc kubenswrapper[4695]: I0320 11:50:26.234973 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vlmrq"] Mar 20 11:50:26 crc kubenswrapper[4695]: I0320 11:50:26.249730 4695 scope.go:117] "RemoveContainer" containerID="38510cdbcc8318bed6f4176c3ce932bb9ab4363f43430744324b495b52b7b477" Mar 20 11:50:26 crc kubenswrapper[4695]: I0320 11:50:26.895492 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="280673d1-e13a-40e9-a52e-62d401c38855" path="/var/lib/kubelet/pods/280673d1-e13a-40e9-a52e-62d401c38855/volumes" Mar 20 11:50:26 crc kubenswrapper[4695]: I0320 11:50:26.916473 4695 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gmrqb" podUID="118d1f4c-12a9-492f-a297-28b139b9bb35" containerName="registry-server" probeResult="failure" output=< Mar 20 11:50:26 crc kubenswrapper[4695]: timeout: failed to connect service ":50051" within 1s Mar 20 11:50:26 crc kubenswrapper[4695]: > Mar 20 11:50:35 crc kubenswrapper[4695]: I0320 11:50:35.922634 4695 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gmrqb" Mar 20 11:50:35 crc kubenswrapper[4695]: I0320 11:50:35.975056 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gmrqb" Mar 20 11:50:36 crc kubenswrapper[4695]: I0320 11:50:36.169261 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gmrqb"] Mar 20 11:50:37 crc kubenswrapper[4695]: I0320 11:50:37.295779 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gmrqb" podUID="118d1f4c-12a9-492f-a297-28b139b9bb35" containerName="registry-server" containerID="cri-o://42155690e38958354fa0cb4d0dca49c0c54570b7427819f3e4b795d7b5ab4d4f" gracePeriod=2 Mar 20 11:50:37 crc kubenswrapper[4695]: I0320 11:50:37.736561 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gmrqb" Mar 20 11:50:37 crc kubenswrapper[4695]: I0320 11:50:37.771425 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/118d1f4c-12a9-492f-a297-28b139b9bb35-catalog-content\") pod \"118d1f4c-12a9-492f-a297-28b139b9bb35\" (UID: \"118d1f4c-12a9-492f-a297-28b139b9bb35\") " Mar 20 11:50:37 crc kubenswrapper[4695]: I0320 11:50:37.771491 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/118d1f4c-12a9-492f-a297-28b139b9bb35-utilities\") pod \"118d1f4c-12a9-492f-a297-28b139b9bb35\" (UID: \"118d1f4c-12a9-492f-a297-28b139b9bb35\") " Mar 20 11:50:37 crc kubenswrapper[4695]: I0320 11:50:37.771653 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7q7dc\" (UniqueName: \"kubernetes.io/projected/118d1f4c-12a9-492f-a297-28b139b9bb35-kube-api-access-7q7dc\") pod 
\"118d1f4c-12a9-492f-a297-28b139b9bb35\" (UID: \"118d1f4c-12a9-492f-a297-28b139b9bb35\") " Mar 20 11:50:37 crc kubenswrapper[4695]: I0320 11:50:37.780532 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/118d1f4c-12a9-492f-a297-28b139b9bb35-utilities" (OuterVolumeSpecName: "utilities") pod "118d1f4c-12a9-492f-a297-28b139b9bb35" (UID: "118d1f4c-12a9-492f-a297-28b139b9bb35"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:50:37 crc kubenswrapper[4695]: I0320 11:50:37.783288 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/118d1f4c-12a9-492f-a297-28b139b9bb35-kube-api-access-7q7dc" (OuterVolumeSpecName: "kube-api-access-7q7dc") pod "118d1f4c-12a9-492f-a297-28b139b9bb35" (UID: "118d1f4c-12a9-492f-a297-28b139b9bb35"). InnerVolumeSpecName "kube-api-access-7q7dc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:50:37 crc kubenswrapper[4695]: I0320 11:50:37.873135 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7q7dc\" (UniqueName: \"kubernetes.io/projected/118d1f4c-12a9-492f-a297-28b139b9bb35-kube-api-access-7q7dc\") on node \"crc\" DevicePath \"\"" Mar 20 11:50:37 crc kubenswrapper[4695]: I0320 11:50:37.873460 4695 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/118d1f4c-12a9-492f-a297-28b139b9bb35-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:50:37 crc kubenswrapper[4695]: I0320 11:50:37.925890 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/118d1f4c-12a9-492f-a297-28b139b9bb35-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "118d1f4c-12a9-492f-a297-28b139b9bb35" (UID: "118d1f4c-12a9-492f-a297-28b139b9bb35"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:50:37 crc kubenswrapper[4695]: I0320 11:50:37.976155 4695 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/118d1f4c-12a9-492f-a297-28b139b9bb35-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:50:38 crc kubenswrapper[4695]: I0320 11:50:38.309005 4695 generic.go:334] "Generic (PLEG): container finished" podID="118d1f4c-12a9-492f-a297-28b139b9bb35" containerID="42155690e38958354fa0cb4d0dca49c0c54570b7427819f3e4b795d7b5ab4d4f" exitCode=0 Mar 20 11:50:38 crc kubenswrapper[4695]: I0320 11:50:38.309080 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gmrqb" event={"ID":"118d1f4c-12a9-492f-a297-28b139b9bb35","Type":"ContainerDied","Data":"42155690e38958354fa0cb4d0dca49c0c54570b7427819f3e4b795d7b5ab4d4f"} Mar 20 11:50:38 crc kubenswrapper[4695]: I0320 11:50:38.309130 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gmrqb" event={"ID":"118d1f4c-12a9-492f-a297-28b139b9bb35","Type":"ContainerDied","Data":"807a211dee4ead410f3129b723371d194ede023942eac706dba6f7885d64347f"} Mar 20 11:50:38 crc kubenswrapper[4695]: I0320 11:50:38.309133 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gmrqb" Mar 20 11:50:38 crc kubenswrapper[4695]: I0320 11:50:38.309153 4695 scope.go:117] "RemoveContainer" containerID="42155690e38958354fa0cb4d0dca49c0c54570b7427819f3e4b795d7b5ab4d4f" Mar 20 11:50:38 crc kubenswrapper[4695]: I0320 11:50:38.335574 4695 scope.go:117] "RemoveContainer" containerID="1abcb4043fff9e476f32b545c6c902f8f3d5755310176521e601b1f6c25a928f" Mar 20 11:50:38 crc kubenswrapper[4695]: I0320 11:50:38.357103 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gmrqb"] Mar 20 11:50:38 crc kubenswrapper[4695]: I0320 11:50:38.367036 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gmrqb"] Mar 20 11:50:38 crc kubenswrapper[4695]: I0320 11:50:38.370600 4695 scope.go:117] "RemoveContainer" containerID="e029c02eee990bbf7249ea282a61c05306810f64635c41ac64eb550b1e4c1384" Mar 20 11:50:38 crc kubenswrapper[4695]: I0320 11:50:38.398354 4695 scope.go:117] "RemoveContainer" containerID="42155690e38958354fa0cb4d0dca49c0c54570b7427819f3e4b795d7b5ab4d4f" Mar 20 11:50:38 crc kubenswrapper[4695]: E0320 11:50:38.399145 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42155690e38958354fa0cb4d0dca49c0c54570b7427819f3e4b795d7b5ab4d4f\": container with ID starting with 42155690e38958354fa0cb4d0dca49c0c54570b7427819f3e4b795d7b5ab4d4f not found: ID does not exist" containerID="42155690e38958354fa0cb4d0dca49c0c54570b7427819f3e4b795d7b5ab4d4f" Mar 20 11:50:38 crc kubenswrapper[4695]: I0320 11:50:38.399206 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42155690e38958354fa0cb4d0dca49c0c54570b7427819f3e4b795d7b5ab4d4f"} err="failed to get container status \"42155690e38958354fa0cb4d0dca49c0c54570b7427819f3e4b795d7b5ab4d4f\": rpc error: code = NotFound desc = could not find container 
\"42155690e38958354fa0cb4d0dca49c0c54570b7427819f3e4b795d7b5ab4d4f\": container with ID starting with 42155690e38958354fa0cb4d0dca49c0c54570b7427819f3e4b795d7b5ab4d4f not found: ID does not exist" Mar 20 11:50:38 crc kubenswrapper[4695]: I0320 11:50:38.399250 4695 scope.go:117] "RemoveContainer" containerID="1abcb4043fff9e476f32b545c6c902f8f3d5755310176521e601b1f6c25a928f" Mar 20 11:50:38 crc kubenswrapper[4695]: E0320 11:50:38.400166 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1abcb4043fff9e476f32b545c6c902f8f3d5755310176521e601b1f6c25a928f\": container with ID starting with 1abcb4043fff9e476f32b545c6c902f8f3d5755310176521e601b1f6c25a928f not found: ID does not exist" containerID="1abcb4043fff9e476f32b545c6c902f8f3d5755310176521e601b1f6c25a928f" Mar 20 11:50:38 crc kubenswrapper[4695]: I0320 11:50:38.400224 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1abcb4043fff9e476f32b545c6c902f8f3d5755310176521e601b1f6c25a928f"} err="failed to get container status \"1abcb4043fff9e476f32b545c6c902f8f3d5755310176521e601b1f6c25a928f\": rpc error: code = NotFound desc = could not find container \"1abcb4043fff9e476f32b545c6c902f8f3d5755310176521e601b1f6c25a928f\": container with ID starting with 1abcb4043fff9e476f32b545c6c902f8f3d5755310176521e601b1f6c25a928f not found: ID does not exist" Mar 20 11:50:38 crc kubenswrapper[4695]: I0320 11:50:38.400272 4695 scope.go:117] "RemoveContainer" containerID="e029c02eee990bbf7249ea282a61c05306810f64635c41ac64eb550b1e4c1384" Mar 20 11:50:38 crc kubenswrapper[4695]: E0320 11:50:38.400636 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e029c02eee990bbf7249ea282a61c05306810f64635c41ac64eb550b1e4c1384\": container with ID starting with e029c02eee990bbf7249ea282a61c05306810f64635c41ac64eb550b1e4c1384 not found: ID does not exist" 
containerID="e029c02eee990bbf7249ea282a61c05306810f64635c41ac64eb550b1e4c1384" Mar 20 11:50:38 crc kubenswrapper[4695]: I0320 11:50:38.400682 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e029c02eee990bbf7249ea282a61c05306810f64635c41ac64eb550b1e4c1384"} err="failed to get container status \"e029c02eee990bbf7249ea282a61c05306810f64635c41ac64eb550b1e4c1384\": rpc error: code = NotFound desc = could not find container \"e029c02eee990bbf7249ea282a61c05306810f64635c41ac64eb550b1e4c1384\": container with ID starting with e029c02eee990bbf7249ea282a61c05306810f64635c41ac64eb550b1e4c1384 not found: ID does not exist" Mar 20 11:50:38 crc kubenswrapper[4695]: I0320 11:50:38.897200 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="118d1f4c-12a9-492f-a297-28b139b9bb35" path="/var/lib/kubelet/pods/118d1f4c-12a9-492f-a297-28b139b9bb35/volumes" Mar 20 11:50:44 crc kubenswrapper[4695]: I0320 11:50:44.585001 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-55pmz"] Mar 20 11:50:44 crc kubenswrapper[4695]: E0320 11:50:44.586271 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="280673d1-e13a-40e9-a52e-62d401c38855" containerName="extract-utilities" Mar 20 11:50:44 crc kubenswrapper[4695]: I0320 11:50:44.586290 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="280673d1-e13a-40e9-a52e-62d401c38855" containerName="extract-utilities" Mar 20 11:50:44 crc kubenswrapper[4695]: E0320 11:50:44.586307 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="118d1f4c-12a9-492f-a297-28b139b9bb35" containerName="registry-server" Mar 20 11:50:44 crc kubenswrapper[4695]: I0320 11:50:44.586322 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="118d1f4c-12a9-492f-a297-28b139b9bb35" containerName="registry-server" Mar 20 11:50:44 crc kubenswrapper[4695]: E0320 11:50:44.586339 4695 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="280673d1-e13a-40e9-a52e-62d401c38855" containerName="registry-server" Mar 20 11:50:44 crc kubenswrapper[4695]: I0320 11:50:44.586346 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="280673d1-e13a-40e9-a52e-62d401c38855" containerName="registry-server" Mar 20 11:50:44 crc kubenswrapper[4695]: E0320 11:50:44.586361 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="118d1f4c-12a9-492f-a297-28b139b9bb35" containerName="extract-content" Mar 20 11:50:44 crc kubenswrapper[4695]: I0320 11:50:44.586368 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="118d1f4c-12a9-492f-a297-28b139b9bb35" containerName="extract-content" Mar 20 11:50:44 crc kubenswrapper[4695]: E0320 11:50:44.586387 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="280673d1-e13a-40e9-a52e-62d401c38855" containerName="extract-content" Mar 20 11:50:44 crc kubenswrapper[4695]: I0320 11:50:44.586394 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="280673d1-e13a-40e9-a52e-62d401c38855" containerName="extract-content" Mar 20 11:50:44 crc kubenswrapper[4695]: E0320 11:50:44.586410 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="118d1f4c-12a9-492f-a297-28b139b9bb35" containerName="extract-utilities" Mar 20 11:50:44 crc kubenswrapper[4695]: I0320 11:50:44.586417 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="118d1f4c-12a9-492f-a297-28b139b9bb35" containerName="extract-utilities" Mar 20 11:50:44 crc kubenswrapper[4695]: I0320 11:50:44.591326 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="280673d1-e13a-40e9-a52e-62d401c38855" containerName="registry-server" Mar 20 11:50:44 crc kubenswrapper[4695]: I0320 11:50:44.591435 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="118d1f4c-12a9-492f-a297-28b139b9bb35" containerName="registry-server" Mar 20 11:50:44 crc kubenswrapper[4695]: I0320 11:50:44.593297 4695 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-55pmz" Mar 20 11:50:44 crc kubenswrapper[4695]: I0320 11:50:44.594210 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-55pmz"] Mar 20 11:50:44 crc kubenswrapper[4695]: I0320 11:50:44.746963 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5addb4d-19b8-4d92-8bab-a43f2326a88b-utilities\") pod \"redhat-marketplace-55pmz\" (UID: \"b5addb4d-19b8-4d92-8bab-a43f2326a88b\") " pod="openshift-marketplace/redhat-marketplace-55pmz" Mar 20 11:50:44 crc kubenswrapper[4695]: I0320 11:50:44.747026 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5addb4d-19b8-4d92-8bab-a43f2326a88b-catalog-content\") pod \"redhat-marketplace-55pmz\" (UID: \"b5addb4d-19b8-4d92-8bab-a43f2326a88b\") " pod="openshift-marketplace/redhat-marketplace-55pmz" Mar 20 11:50:44 crc kubenswrapper[4695]: I0320 11:50:44.747059 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2l5z\" (UniqueName: \"kubernetes.io/projected/b5addb4d-19b8-4d92-8bab-a43f2326a88b-kube-api-access-q2l5z\") pod \"redhat-marketplace-55pmz\" (UID: \"b5addb4d-19b8-4d92-8bab-a43f2326a88b\") " pod="openshift-marketplace/redhat-marketplace-55pmz" Mar 20 11:50:44 crc kubenswrapper[4695]: I0320 11:50:44.849861 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5addb4d-19b8-4d92-8bab-a43f2326a88b-utilities\") pod \"redhat-marketplace-55pmz\" (UID: \"b5addb4d-19b8-4d92-8bab-a43f2326a88b\") " pod="openshift-marketplace/redhat-marketplace-55pmz" Mar 20 11:50:44 crc kubenswrapper[4695]: I0320 11:50:44.849941 4695 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5addb4d-19b8-4d92-8bab-a43f2326a88b-catalog-content\") pod \"redhat-marketplace-55pmz\" (UID: \"b5addb4d-19b8-4d92-8bab-a43f2326a88b\") " pod="openshift-marketplace/redhat-marketplace-55pmz" Mar 20 11:50:44 crc kubenswrapper[4695]: I0320 11:50:44.849984 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2l5z\" (UniqueName: \"kubernetes.io/projected/b5addb4d-19b8-4d92-8bab-a43f2326a88b-kube-api-access-q2l5z\") pod \"redhat-marketplace-55pmz\" (UID: \"b5addb4d-19b8-4d92-8bab-a43f2326a88b\") " pod="openshift-marketplace/redhat-marketplace-55pmz" Mar 20 11:50:44 crc kubenswrapper[4695]: I0320 11:50:44.850469 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5addb4d-19b8-4d92-8bab-a43f2326a88b-utilities\") pod \"redhat-marketplace-55pmz\" (UID: \"b5addb4d-19b8-4d92-8bab-a43f2326a88b\") " pod="openshift-marketplace/redhat-marketplace-55pmz" Mar 20 11:50:44 crc kubenswrapper[4695]: I0320 11:50:44.851105 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5addb4d-19b8-4d92-8bab-a43f2326a88b-catalog-content\") pod \"redhat-marketplace-55pmz\" (UID: \"b5addb4d-19b8-4d92-8bab-a43f2326a88b\") " pod="openshift-marketplace/redhat-marketplace-55pmz" Mar 20 11:50:44 crc kubenswrapper[4695]: I0320 11:50:44.873844 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2l5z\" (UniqueName: \"kubernetes.io/projected/b5addb4d-19b8-4d92-8bab-a43f2326a88b-kube-api-access-q2l5z\") pod \"redhat-marketplace-55pmz\" (UID: \"b5addb4d-19b8-4d92-8bab-a43f2326a88b\") " pod="openshift-marketplace/redhat-marketplace-55pmz" Mar 20 11:50:44 crc kubenswrapper[4695]: I0320 11:50:44.914330 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-55pmz" Mar 20 11:50:45 crc kubenswrapper[4695]: I0320 11:50:45.947133 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-55pmz"] Mar 20 11:50:46 crc kubenswrapper[4695]: I0320 11:50:46.386343 4695 generic.go:334] "Generic (PLEG): container finished" podID="b5addb4d-19b8-4d92-8bab-a43f2326a88b" containerID="13789b385914dbc07878b8e46885de03bb913d27885a83afc4b0f3a8e6f18036" exitCode=0 Mar 20 11:50:46 crc kubenswrapper[4695]: I0320 11:50:46.386446 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-55pmz" event={"ID":"b5addb4d-19b8-4d92-8bab-a43f2326a88b","Type":"ContainerDied","Data":"13789b385914dbc07878b8e46885de03bb913d27885a83afc4b0f3a8e6f18036"} Mar 20 11:50:46 crc kubenswrapper[4695]: I0320 11:50:46.386802 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-55pmz" event={"ID":"b5addb4d-19b8-4d92-8bab-a43f2326a88b","Type":"ContainerStarted","Data":"6b53fe100954c9d0a12dfb865d10a6d252a9e5ed8322a14dcf2aa2dd2ca6a2b1"} Mar 20 11:50:48 crc kubenswrapper[4695]: I0320 11:50:48.408489 4695 generic.go:334] "Generic (PLEG): container finished" podID="b5addb4d-19b8-4d92-8bab-a43f2326a88b" containerID="4328477cfff42bf9839a6dbe666534b4b8f257b72d0f1cfd702c3f1f815bb8c4" exitCode=0 Mar 20 11:50:48 crc kubenswrapper[4695]: I0320 11:50:48.409336 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-55pmz" event={"ID":"b5addb4d-19b8-4d92-8bab-a43f2326a88b","Type":"ContainerDied","Data":"4328477cfff42bf9839a6dbe666534b4b8f257b72d0f1cfd702c3f1f815bb8c4"} Mar 20 11:50:50 crc kubenswrapper[4695]: I0320 11:50:50.511273 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-55pmz" 
event={"ID":"b5addb4d-19b8-4d92-8bab-a43f2326a88b","Type":"ContainerStarted","Data":"fce903c79728e9496c9f7bf0ac538061b2fdfe530806046dffe8fdff1c195182"} Mar 20 11:50:50 crc kubenswrapper[4695]: I0320 11:50:50.534926 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-55pmz" podStartSLOduration=4.077330256 podStartE2EDuration="6.534884174s" podCreationTimestamp="2026-03-20 11:50:44 +0000 UTC" firstStartedPulling="2026-03-20 11:50:46.38845006 +0000 UTC m=+3424.169055623" lastFinishedPulling="2026-03-20 11:50:48.846003968 +0000 UTC m=+3426.626609541" observedRunningTime="2026-03-20 11:50:50.533959891 +0000 UTC m=+3428.314565454" watchObservedRunningTime="2026-03-20 11:50:50.534884174 +0000 UTC m=+3428.315489737" Mar 20 11:50:54 crc kubenswrapper[4695]: I0320 11:50:54.915141 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-55pmz" Mar 20 11:50:54 crc kubenswrapper[4695]: I0320 11:50:54.915933 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-55pmz" Mar 20 11:50:54 crc kubenswrapper[4695]: I0320 11:50:54.978105 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-55pmz" Mar 20 11:50:55 crc kubenswrapper[4695]: I0320 11:50:55.622884 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-55pmz" Mar 20 11:50:55 crc kubenswrapper[4695]: I0320 11:50:55.684004 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-55pmz"] Mar 20 11:50:57 crc kubenswrapper[4695]: I0320 11:50:57.576355 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-55pmz" podUID="b5addb4d-19b8-4d92-8bab-a43f2326a88b" containerName="registry-server" 
containerID="cri-o://fce903c79728e9496c9f7bf0ac538061b2fdfe530806046dffe8fdff1c195182" gracePeriod=2 Mar 20 11:50:58 crc kubenswrapper[4695]: I0320 11:50:58.003346 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-55pmz" Mar 20 11:50:58 crc kubenswrapper[4695]: I0320 11:50:58.145999 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2l5z\" (UniqueName: \"kubernetes.io/projected/b5addb4d-19b8-4d92-8bab-a43f2326a88b-kube-api-access-q2l5z\") pod \"b5addb4d-19b8-4d92-8bab-a43f2326a88b\" (UID: \"b5addb4d-19b8-4d92-8bab-a43f2326a88b\") " Mar 20 11:50:58 crc kubenswrapper[4695]: I0320 11:50:58.146164 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5addb4d-19b8-4d92-8bab-a43f2326a88b-utilities\") pod \"b5addb4d-19b8-4d92-8bab-a43f2326a88b\" (UID: \"b5addb4d-19b8-4d92-8bab-a43f2326a88b\") " Mar 20 11:50:58 crc kubenswrapper[4695]: I0320 11:50:58.146185 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5addb4d-19b8-4d92-8bab-a43f2326a88b-catalog-content\") pod \"b5addb4d-19b8-4d92-8bab-a43f2326a88b\" (UID: \"b5addb4d-19b8-4d92-8bab-a43f2326a88b\") " Mar 20 11:50:58 crc kubenswrapper[4695]: I0320 11:50:58.147229 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5addb4d-19b8-4d92-8bab-a43f2326a88b-utilities" (OuterVolumeSpecName: "utilities") pod "b5addb4d-19b8-4d92-8bab-a43f2326a88b" (UID: "b5addb4d-19b8-4d92-8bab-a43f2326a88b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:50:58 crc kubenswrapper[4695]: I0320 11:50:58.148186 4695 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5addb4d-19b8-4d92-8bab-a43f2326a88b-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:50:58 crc kubenswrapper[4695]: I0320 11:50:58.155426 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5addb4d-19b8-4d92-8bab-a43f2326a88b-kube-api-access-q2l5z" (OuterVolumeSpecName: "kube-api-access-q2l5z") pod "b5addb4d-19b8-4d92-8bab-a43f2326a88b" (UID: "b5addb4d-19b8-4d92-8bab-a43f2326a88b"). InnerVolumeSpecName "kube-api-access-q2l5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:50:58 crc kubenswrapper[4695]: I0320 11:50:58.238385 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5addb4d-19b8-4d92-8bab-a43f2326a88b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b5addb4d-19b8-4d92-8bab-a43f2326a88b" (UID: "b5addb4d-19b8-4d92-8bab-a43f2326a88b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:50:58 crc kubenswrapper[4695]: I0320 11:50:58.248941 4695 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5addb4d-19b8-4d92-8bab-a43f2326a88b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:50:58 crc kubenswrapper[4695]: I0320 11:50:58.249039 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2l5z\" (UniqueName: \"kubernetes.io/projected/b5addb4d-19b8-4d92-8bab-a43f2326a88b-kube-api-access-q2l5z\") on node \"crc\" DevicePath \"\"" Mar 20 11:50:58 crc kubenswrapper[4695]: I0320 11:50:58.587035 4695 generic.go:334] "Generic (PLEG): container finished" podID="b5addb4d-19b8-4d92-8bab-a43f2326a88b" containerID="fce903c79728e9496c9f7bf0ac538061b2fdfe530806046dffe8fdff1c195182" exitCode=0 Mar 20 11:50:58 crc kubenswrapper[4695]: I0320 11:50:58.587117 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-55pmz" event={"ID":"b5addb4d-19b8-4d92-8bab-a43f2326a88b","Type":"ContainerDied","Data":"fce903c79728e9496c9f7bf0ac538061b2fdfe530806046dffe8fdff1c195182"} Mar 20 11:50:58 crc kubenswrapper[4695]: I0320 11:50:58.587157 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-55pmz" event={"ID":"b5addb4d-19b8-4d92-8bab-a43f2326a88b","Type":"ContainerDied","Data":"6b53fe100954c9d0a12dfb865d10a6d252a9e5ed8322a14dcf2aa2dd2ca6a2b1"} Mar 20 11:50:58 crc kubenswrapper[4695]: I0320 11:50:58.587175 4695 scope.go:117] "RemoveContainer" containerID="fce903c79728e9496c9f7bf0ac538061b2fdfe530806046dffe8fdff1c195182" Mar 20 11:50:58 crc kubenswrapper[4695]: I0320 11:50:58.587174 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-55pmz" Mar 20 11:50:58 crc kubenswrapper[4695]: I0320 11:50:58.626301 4695 scope.go:117] "RemoveContainer" containerID="4328477cfff42bf9839a6dbe666534b4b8f257b72d0f1cfd702c3f1f815bb8c4" Mar 20 11:50:58 crc kubenswrapper[4695]: I0320 11:50:58.629563 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-55pmz"] Mar 20 11:50:58 crc kubenswrapper[4695]: I0320 11:50:58.638309 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-55pmz"] Mar 20 11:50:58 crc kubenswrapper[4695]: I0320 11:50:58.646255 4695 scope.go:117] "RemoveContainer" containerID="13789b385914dbc07878b8e46885de03bb913d27885a83afc4b0f3a8e6f18036" Mar 20 11:50:58 crc kubenswrapper[4695]: I0320 11:50:58.672901 4695 scope.go:117] "RemoveContainer" containerID="fce903c79728e9496c9f7bf0ac538061b2fdfe530806046dffe8fdff1c195182" Mar 20 11:50:58 crc kubenswrapper[4695]: E0320 11:50:58.673461 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fce903c79728e9496c9f7bf0ac538061b2fdfe530806046dffe8fdff1c195182\": container with ID starting with fce903c79728e9496c9f7bf0ac538061b2fdfe530806046dffe8fdff1c195182 not found: ID does not exist" containerID="fce903c79728e9496c9f7bf0ac538061b2fdfe530806046dffe8fdff1c195182" Mar 20 11:50:58 crc kubenswrapper[4695]: I0320 11:50:58.673510 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fce903c79728e9496c9f7bf0ac538061b2fdfe530806046dffe8fdff1c195182"} err="failed to get container status \"fce903c79728e9496c9f7bf0ac538061b2fdfe530806046dffe8fdff1c195182\": rpc error: code = NotFound desc = could not find container \"fce903c79728e9496c9f7bf0ac538061b2fdfe530806046dffe8fdff1c195182\": container with ID starting with fce903c79728e9496c9f7bf0ac538061b2fdfe530806046dffe8fdff1c195182 not found: 
ID does not exist" Mar 20 11:50:58 crc kubenswrapper[4695]: I0320 11:50:58.673541 4695 scope.go:117] "RemoveContainer" containerID="4328477cfff42bf9839a6dbe666534b4b8f257b72d0f1cfd702c3f1f815bb8c4" Mar 20 11:50:58 crc kubenswrapper[4695]: E0320 11:50:58.673945 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4328477cfff42bf9839a6dbe666534b4b8f257b72d0f1cfd702c3f1f815bb8c4\": container with ID starting with 4328477cfff42bf9839a6dbe666534b4b8f257b72d0f1cfd702c3f1f815bb8c4 not found: ID does not exist" containerID="4328477cfff42bf9839a6dbe666534b4b8f257b72d0f1cfd702c3f1f815bb8c4" Mar 20 11:50:58 crc kubenswrapper[4695]: I0320 11:50:58.673980 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4328477cfff42bf9839a6dbe666534b4b8f257b72d0f1cfd702c3f1f815bb8c4"} err="failed to get container status \"4328477cfff42bf9839a6dbe666534b4b8f257b72d0f1cfd702c3f1f815bb8c4\": rpc error: code = NotFound desc = could not find container \"4328477cfff42bf9839a6dbe666534b4b8f257b72d0f1cfd702c3f1f815bb8c4\": container with ID starting with 4328477cfff42bf9839a6dbe666534b4b8f257b72d0f1cfd702c3f1f815bb8c4 not found: ID does not exist" Mar 20 11:50:58 crc kubenswrapper[4695]: I0320 11:50:58.674002 4695 scope.go:117] "RemoveContainer" containerID="13789b385914dbc07878b8e46885de03bb913d27885a83afc4b0f3a8e6f18036" Mar 20 11:50:58 crc kubenswrapper[4695]: E0320 11:50:58.674287 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13789b385914dbc07878b8e46885de03bb913d27885a83afc4b0f3a8e6f18036\": container with ID starting with 13789b385914dbc07878b8e46885de03bb913d27885a83afc4b0f3a8e6f18036 not found: ID does not exist" containerID="13789b385914dbc07878b8e46885de03bb913d27885a83afc4b0f3a8e6f18036" Mar 20 11:50:58 crc kubenswrapper[4695]: I0320 11:50:58.674324 4695 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13789b385914dbc07878b8e46885de03bb913d27885a83afc4b0f3a8e6f18036"} err="failed to get container status \"13789b385914dbc07878b8e46885de03bb913d27885a83afc4b0f3a8e6f18036\": rpc error: code = NotFound desc = could not find container \"13789b385914dbc07878b8e46885de03bb913d27885a83afc4b0f3a8e6f18036\": container with ID starting with 13789b385914dbc07878b8e46885de03bb913d27885a83afc4b0f3a8e6f18036 not found: ID does not exist" Mar 20 11:50:58 crc kubenswrapper[4695]: I0320 11:50:58.896552 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5addb4d-19b8-4d92-8bab-a43f2326a88b" path="/var/lib/kubelet/pods/b5addb4d-19b8-4d92-8bab-a43f2326a88b/volumes" Mar 20 11:51:08 crc kubenswrapper[4695]: I0320 11:51:08.431253 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:51:08 crc kubenswrapper[4695]: I0320 11:51:08.431961 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:51:38 crc kubenswrapper[4695]: I0320 11:51:38.431273 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:51:38 crc kubenswrapper[4695]: I0320 11:51:38.432071 4695 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:52:00 crc kubenswrapper[4695]: I0320 11:52:00.158774 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566792-bpcrd"] Mar 20 11:52:00 crc kubenswrapper[4695]: E0320 11:52:00.159973 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5addb4d-19b8-4d92-8bab-a43f2326a88b" containerName="extract-content" Mar 20 11:52:00 crc kubenswrapper[4695]: I0320 11:52:00.159990 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5addb4d-19b8-4d92-8bab-a43f2326a88b" containerName="extract-content" Mar 20 11:52:00 crc kubenswrapper[4695]: E0320 11:52:00.160013 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5addb4d-19b8-4d92-8bab-a43f2326a88b" containerName="registry-server" Mar 20 11:52:00 crc kubenswrapper[4695]: I0320 11:52:00.160033 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5addb4d-19b8-4d92-8bab-a43f2326a88b" containerName="registry-server" Mar 20 11:52:00 crc kubenswrapper[4695]: E0320 11:52:00.160045 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5addb4d-19b8-4d92-8bab-a43f2326a88b" containerName="extract-utilities" Mar 20 11:52:00 crc kubenswrapper[4695]: I0320 11:52:00.160055 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5addb4d-19b8-4d92-8bab-a43f2326a88b" containerName="extract-utilities" Mar 20 11:52:00 crc kubenswrapper[4695]: I0320 11:52:00.160248 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5addb4d-19b8-4d92-8bab-a43f2326a88b" containerName="registry-server" Mar 20 11:52:00 crc kubenswrapper[4695]: I0320 11:52:00.161006 4695 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566792-bpcrd" Mar 20 11:52:00 crc kubenswrapper[4695]: I0320 11:52:00.164069 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5kqds" Mar 20 11:52:00 crc kubenswrapper[4695]: I0320 11:52:00.164422 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:52:00 crc kubenswrapper[4695]: I0320 11:52:00.164732 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:52:00 crc kubenswrapper[4695]: I0320 11:52:00.165874 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566792-bpcrd"] Mar 20 11:52:00 crc kubenswrapper[4695]: I0320 11:52:00.273777 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skd9j\" (UniqueName: \"kubernetes.io/projected/61195863-60cd-4365-acce-2ff9278436f5-kube-api-access-skd9j\") pod \"auto-csr-approver-29566792-bpcrd\" (UID: \"61195863-60cd-4365-acce-2ff9278436f5\") " pod="openshift-infra/auto-csr-approver-29566792-bpcrd" Mar 20 11:52:00 crc kubenswrapper[4695]: I0320 11:52:00.375115 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skd9j\" (UniqueName: \"kubernetes.io/projected/61195863-60cd-4365-acce-2ff9278436f5-kube-api-access-skd9j\") pod \"auto-csr-approver-29566792-bpcrd\" (UID: \"61195863-60cd-4365-acce-2ff9278436f5\") " pod="openshift-infra/auto-csr-approver-29566792-bpcrd" Mar 20 11:52:00 crc kubenswrapper[4695]: I0320 11:52:00.398501 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skd9j\" (UniqueName: \"kubernetes.io/projected/61195863-60cd-4365-acce-2ff9278436f5-kube-api-access-skd9j\") pod \"auto-csr-approver-29566792-bpcrd\" (UID: \"61195863-60cd-4365-acce-2ff9278436f5\") " 
pod="openshift-infra/auto-csr-approver-29566792-bpcrd" Mar 20 11:52:00 crc kubenswrapper[4695]: I0320 11:52:00.487265 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566792-bpcrd" Mar 20 11:52:00 crc kubenswrapper[4695]: I0320 11:52:00.919538 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566792-bpcrd"] Mar 20 11:52:01 crc kubenswrapper[4695]: I0320 11:52:01.051680 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566792-bpcrd" event={"ID":"61195863-60cd-4365-acce-2ff9278436f5","Type":"ContainerStarted","Data":"410e4b2ab63f5cc570fe2ddf5df3232995f1b7fb5d55dc49b6994dd00efea19b"} Mar 20 11:52:03 crc kubenswrapper[4695]: I0320 11:52:03.066740 4695 generic.go:334] "Generic (PLEG): container finished" podID="61195863-60cd-4365-acce-2ff9278436f5" containerID="6789a6e971ea2bfe411f925d5c9fe6f61fd64fb966361f4c6cc40acecaa96d2b" exitCode=0 Mar 20 11:52:03 crc kubenswrapper[4695]: I0320 11:52:03.068086 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566792-bpcrd" event={"ID":"61195863-60cd-4365-acce-2ff9278436f5","Type":"ContainerDied","Data":"6789a6e971ea2bfe411f925d5c9fe6f61fd64fb966361f4c6cc40acecaa96d2b"} Mar 20 11:52:04 crc kubenswrapper[4695]: I0320 11:52:04.338112 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566792-bpcrd" Mar 20 11:52:04 crc kubenswrapper[4695]: I0320 11:52:04.537779 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skd9j\" (UniqueName: \"kubernetes.io/projected/61195863-60cd-4365-acce-2ff9278436f5-kube-api-access-skd9j\") pod \"61195863-60cd-4365-acce-2ff9278436f5\" (UID: \"61195863-60cd-4365-acce-2ff9278436f5\") " Mar 20 11:52:04 crc kubenswrapper[4695]: I0320 11:52:04.548221 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61195863-60cd-4365-acce-2ff9278436f5-kube-api-access-skd9j" (OuterVolumeSpecName: "kube-api-access-skd9j") pod "61195863-60cd-4365-acce-2ff9278436f5" (UID: "61195863-60cd-4365-acce-2ff9278436f5"). InnerVolumeSpecName "kube-api-access-skd9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:52:04 crc kubenswrapper[4695]: I0320 11:52:04.640089 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skd9j\" (UniqueName: \"kubernetes.io/projected/61195863-60cd-4365-acce-2ff9278436f5-kube-api-access-skd9j\") on node \"crc\" DevicePath \"\"" Mar 20 11:52:05 crc kubenswrapper[4695]: I0320 11:52:05.090157 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566792-bpcrd" event={"ID":"61195863-60cd-4365-acce-2ff9278436f5","Type":"ContainerDied","Data":"410e4b2ab63f5cc570fe2ddf5df3232995f1b7fb5d55dc49b6994dd00efea19b"} Mar 20 11:52:05 crc kubenswrapper[4695]: I0320 11:52:05.090220 4695 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="410e4b2ab63f5cc570fe2ddf5df3232995f1b7fb5d55dc49b6994dd00efea19b" Mar 20 11:52:05 crc kubenswrapper[4695]: I0320 11:52:05.090291 4695 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566792-bpcrd" Mar 20 11:52:05 crc kubenswrapper[4695]: I0320 11:52:05.419242 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566786-2qpvb"] Mar 20 11:52:05 crc kubenswrapper[4695]: I0320 11:52:05.426569 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566786-2qpvb"] Mar 20 11:52:06 crc kubenswrapper[4695]: I0320 11:52:06.897860 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c827f5a-671e-4fbd-acac-bb6029906f10" path="/var/lib/kubelet/pods/5c827f5a-671e-4fbd-acac-bb6029906f10/volumes" Mar 20 11:52:07 crc kubenswrapper[4695]: I0320 11:52:07.800116 4695 scope.go:117] "RemoveContainer" containerID="0697f9b13d8e10ac3e3ad68e61de3146506c09e3896dd149535a3744d87786d0" Mar 20 11:52:08 crc kubenswrapper[4695]: I0320 11:52:08.431241 4695 patch_prober.go:28] interesting pod/machine-config-daemon-bnwz5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:52:08 crc kubenswrapper[4695]: I0320 11:52:08.431734 4695 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:52:08 crc kubenswrapper[4695]: I0320 11:52:08.431795 4695 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" Mar 20 11:52:08 crc kubenswrapper[4695]: I0320 11:52:08.432701 4695 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"7a3ae36c199c203aaa8df730b6a5da70999a35dbcf7d29a69b9177c6281627a5"} pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:52:08 crc kubenswrapper[4695]: I0320 11:52:08.432773 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" podUID="7859c924-84d7-4855-901e-c77a02c56e3a" containerName="machine-config-daemon" containerID="cri-o://7a3ae36c199c203aaa8df730b6a5da70999a35dbcf7d29a69b9177c6281627a5" gracePeriod=600 Mar 20 11:52:09 crc kubenswrapper[4695]: I0320 11:52:09.127698 4695 generic.go:334] "Generic (PLEG): container finished" podID="7859c924-84d7-4855-901e-c77a02c56e3a" containerID="7a3ae36c199c203aaa8df730b6a5da70999a35dbcf7d29a69b9177c6281627a5" exitCode=0 Mar 20 11:52:09 crc kubenswrapper[4695]: I0320 11:52:09.127823 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" event={"ID":"7859c924-84d7-4855-901e-c77a02c56e3a","Type":"ContainerDied","Data":"7a3ae36c199c203aaa8df730b6a5da70999a35dbcf7d29a69b9177c6281627a5"} Mar 20 11:52:09 crc kubenswrapper[4695]: I0320 11:52:09.128249 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-bnwz5" event={"ID":"7859c924-84d7-4855-901e-c77a02c56e3a","Type":"ContainerStarted","Data":"f54759fa725e37900fb69339efe2c3a6f8dd9aef3cd872a37cae005966527a63"} Mar 20 11:52:09 crc kubenswrapper[4695]: I0320 11:52:09.128280 4695 scope.go:117] "RemoveContainer" containerID="9241da4463d954f75a73e9af61223602bd89c45e41ff89e8b737f1c8c56e0683" Mar 20 11:53:18 crc kubenswrapper[4695]: I0320 11:53:18.334868 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q8drr"] Mar 20 11:53:18 crc kubenswrapper[4695]: E0320 
11:53:18.336162 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61195863-60cd-4365-acce-2ff9278436f5" containerName="oc" Mar 20 11:53:18 crc kubenswrapper[4695]: I0320 11:53:18.336181 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="61195863-60cd-4365-acce-2ff9278436f5" containerName="oc" Mar 20 11:53:18 crc kubenswrapper[4695]: I0320 11:53:18.336375 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="61195863-60cd-4365-acce-2ff9278436f5" containerName="oc" Mar 20 11:53:18 crc kubenswrapper[4695]: I0320 11:53:18.337691 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q8drr" Mar 20 11:53:18 crc kubenswrapper[4695]: I0320 11:53:18.349211 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q8drr"] Mar 20 11:53:18 crc kubenswrapper[4695]: I0320 11:53:18.470855 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2909b97-5589-4705-bead-c164b10661e6-catalog-content\") pod \"community-operators-q8drr\" (UID: \"f2909b97-5589-4705-bead-c164b10661e6\") " pod="openshift-marketplace/community-operators-q8drr" Mar 20 11:53:18 crc kubenswrapper[4695]: I0320 11:53:18.471321 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2909b97-5589-4705-bead-c164b10661e6-utilities\") pod \"community-operators-q8drr\" (UID: \"f2909b97-5589-4705-bead-c164b10661e6\") " pod="openshift-marketplace/community-operators-q8drr" Mar 20 11:53:18 crc kubenswrapper[4695]: I0320 11:53:18.471460 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57jsn\" (UniqueName: \"kubernetes.io/projected/f2909b97-5589-4705-bead-c164b10661e6-kube-api-access-57jsn\") pod 
\"community-operators-q8drr\" (UID: \"f2909b97-5589-4705-bead-c164b10661e6\") " pod="openshift-marketplace/community-operators-q8drr" Mar 20 11:53:18 crc kubenswrapper[4695]: I0320 11:53:18.573525 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2909b97-5589-4705-bead-c164b10661e6-catalog-content\") pod \"community-operators-q8drr\" (UID: \"f2909b97-5589-4705-bead-c164b10661e6\") " pod="openshift-marketplace/community-operators-q8drr" Mar 20 11:53:18 crc kubenswrapper[4695]: I0320 11:53:18.573882 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2909b97-5589-4705-bead-c164b10661e6-utilities\") pod \"community-operators-q8drr\" (UID: \"f2909b97-5589-4705-bead-c164b10661e6\") " pod="openshift-marketplace/community-operators-q8drr" Mar 20 11:53:18 crc kubenswrapper[4695]: I0320 11:53:18.574058 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57jsn\" (UniqueName: \"kubernetes.io/projected/f2909b97-5589-4705-bead-c164b10661e6-kube-api-access-57jsn\") pod \"community-operators-q8drr\" (UID: \"f2909b97-5589-4705-bead-c164b10661e6\") " pod="openshift-marketplace/community-operators-q8drr" Mar 20 11:53:18 crc kubenswrapper[4695]: I0320 11:53:18.574225 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2909b97-5589-4705-bead-c164b10661e6-catalog-content\") pod \"community-operators-q8drr\" (UID: \"f2909b97-5589-4705-bead-c164b10661e6\") " pod="openshift-marketplace/community-operators-q8drr" Mar 20 11:53:18 crc kubenswrapper[4695]: I0320 11:53:18.574326 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2909b97-5589-4705-bead-c164b10661e6-utilities\") pod \"community-operators-q8drr\" (UID: 
\"f2909b97-5589-4705-bead-c164b10661e6\") " pod="openshift-marketplace/community-operators-q8drr"
Mar 20 11:53:18 crc kubenswrapper[4695]: I0320 11:53:18.609506 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57jsn\" (UniqueName: \"kubernetes.io/projected/f2909b97-5589-4705-bead-c164b10661e6-kube-api-access-57jsn\") pod \"community-operators-q8drr\" (UID: \"f2909b97-5589-4705-bead-c164b10661e6\") " pod="openshift-marketplace/community-operators-q8drr"
Mar 20 11:53:18 crc kubenswrapper[4695]: I0320 11:53:18.659705 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q8drr"
Mar 20 11:53:19 crc kubenswrapper[4695]: I0320 11:53:19.199792 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q8drr"]
Mar 20 11:53:19 crc kubenswrapper[4695]: I0320 11:53:19.712729 4695 generic.go:334] "Generic (PLEG): container finished" podID="f2909b97-5589-4705-bead-c164b10661e6" containerID="adfeaed1440eb892d8e2e66bd5b90f8a96a6d55fbefc7f75534f361d566082e2" exitCode=0
Mar 20 11:53:19 crc kubenswrapper[4695]: I0320 11:53:19.712849 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q8drr" event={"ID":"f2909b97-5589-4705-bead-c164b10661e6","Type":"ContainerDied","Data":"adfeaed1440eb892d8e2e66bd5b90f8a96a6d55fbefc7f75534f361d566082e2"}
Mar 20 11:53:19 crc kubenswrapper[4695]: I0320 11:53:19.713162 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q8drr" event={"ID":"f2909b97-5589-4705-bead-c164b10661e6","Type":"ContainerStarted","Data":"d80f1c9cd009cb030fb9900693de9047e6fd68112f377dd654e47c75045fce47"}
Mar 20 11:53:21 crc kubenswrapper[4695]: I0320 11:53:21.729633 4695 generic.go:334] "Generic (PLEG): container finished" podID="f2909b97-5589-4705-bead-c164b10661e6" containerID="237bbc6ceef617f9d5d2e4c5d1ea547c383e19a4d6c071d98277f116c539563f" exitCode=0
Mar 20 11:53:21 crc kubenswrapper[4695]: I0320 11:53:21.729680 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q8drr" event={"ID":"f2909b97-5589-4705-bead-c164b10661e6","Type":"ContainerDied","Data":"237bbc6ceef617f9d5d2e4c5d1ea547c383e19a4d6c071d98277f116c539563f"}
Mar 20 11:53:22 crc kubenswrapper[4695]: I0320 11:53:22.738823 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q8drr" event={"ID":"f2909b97-5589-4705-bead-c164b10661e6","Type":"ContainerStarted","Data":"6bacdca490bda379c7113fd17eaeee185cc44e1fba4bc4638eca3f93689b1494"}
Mar 20 11:53:22 crc kubenswrapper[4695]: I0320 11:53:22.763381 4695 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q8drr" podStartSLOduration=2.321389312 podStartE2EDuration="4.763360561s" podCreationTimestamp="2026-03-20 11:53:18 +0000 UTC" firstStartedPulling="2026-03-20 11:53:19.714391328 +0000 UTC m=+3577.494996891" lastFinishedPulling="2026-03-20 11:53:22.156362577 +0000 UTC m=+3579.936968140" observedRunningTime="2026-03-20 11:53:22.757535733 +0000 UTC m=+3580.538141296" watchObservedRunningTime="2026-03-20 11:53:22.763360561 +0000 UTC m=+3580.543966114"
Mar 20 11:53:28 crc kubenswrapper[4695]: I0320 11:53:28.660765 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q8drr"
Mar 20 11:53:28 crc kubenswrapper[4695]: I0320 11:53:28.661134 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q8drr"
Mar 20 11:53:28 crc kubenswrapper[4695]: I0320 11:53:28.706989 4695 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q8drr"
Mar 20 11:53:28 crc kubenswrapper[4695]: I0320 11:53:28.829164 4695 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q8drr"
Mar 20 11:53:28 crc kubenswrapper[4695]: I0320 11:53:28.940675 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q8drr"]
Mar 20 11:53:30 crc kubenswrapper[4695]: I0320 11:53:30.803152 4695 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q8drr" podUID="f2909b97-5589-4705-bead-c164b10661e6" containerName="registry-server" containerID="cri-o://6bacdca490bda379c7113fd17eaeee185cc44e1fba4bc4638eca3f93689b1494" gracePeriod=2
Mar 20 11:53:31 crc kubenswrapper[4695]: I0320 11:53:31.165106 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q8drr"
Mar 20 11:53:31 crc kubenswrapper[4695]: I0320 11:53:31.279124 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57jsn\" (UniqueName: \"kubernetes.io/projected/f2909b97-5589-4705-bead-c164b10661e6-kube-api-access-57jsn\") pod \"f2909b97-5589-4705-bead-c164b10661e6\" (UID: \"f2909b97-5589-4705-bead-c164b10661e6\") "
Mar 20 11:53:31 crc kubenswrapper[4695]: I0320 11:53:31.279246 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2909b97-5589-4705-bead-c164b10661e6-catalog-content\") pod \"f2909b97-5589-4705-bead-c164b10661e6\" (UID: \"f2909b97-5589-4705-bead-c164b10661e6\") "
Mar 20 11:53:31 crc kubenswrapper[4695]: I0320 11:53:31.279283 4695 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2909b97-5589-4705-bead-c164b10661e6-utilities\") pod \"f2909b97-5589-4705-bead-c164b10661e6\" (UID: \"f2909b97-5589-4705-bead-c164b10661e6\") "
Mar 20 11:53:31 crc kubenswrapper[4695]: I0320 11:53:31.280586 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2909b97-5589-4705-bead-c164b10661e6-utilities" (OuterVolumeSpecName: "utilities") pod "f2909b97-5589-4705-bead-c164b10661e6" (UID: "f2909b97-5589-4705-bead-c164b10661e6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:53:31 crc kubenswrapper[4695]: I0320 11:53:31.292452 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2909b97-5589-4705-bead-c164b10661e6-kube-api-access-57jsn" (OuterVolumeSpecName: "kube-api-access-57jsn") pod "f2909b97-5589-4705-bead-c164b10661e6" (UID: "f2909b97-5589-4705-bead-c164b10661e6"). InnerVolumeSpecName "kube-api-access-57jsn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 11:53:31 crc kubenswrapper[4695]: I0320 11:53:31.341600 4695 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2909b97-5589-4705-bead-c164b10661e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f2909b97-5589-4705-bead-c164b10661e6" (UID: "f2909b97-5589-4705-bead-c164b10661e6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 11:53:31 crc kubenswrapper[4695]: I0320 11:53:31.381149 4695 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2909b97-5589-4705-bead-c164b10661e6-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 11:53:31 crc kubenswrapper[4695]: I0320 11:53:31.381195 4695 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2909b97-5589-4705-bead-c164b10661e6-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 11:53:31 crc kubenswrapper[4695]: I0320 11:53:31.381213 4695 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57jsn\" (UniqueName: \"kubernetes.io/projected/f2909b97-5589-4705-bead-c164b10661e6-kube-api-access-57jsn\") on node \"crc\" DevicePath \"\""
Mar 20 11:53:31 crc kubenswrapper[4695]: I0320 11:53:31.811206 4695 generic.go:334] "Generic (PLEG): container finished" podID="f2909b97-5589-4705-bead-c164b10661e6" containerID="6bacdca490bda379c7113fd17eaeee185cc44e1fba4bc4638eca3f93689b1494" exitCode=0
Mar 20 11:53:31 crc kubenswrapper[4695]: I0320 11:53:31.811252 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q8drr" event={"ID":"f2909b97-5589-4705-bead-c164b10661e6","Type":"ContainerDied","Data":"6bacdca490bda379c7113fd17eaeee185cc44e1fba4bc4638eca3f93689b1494"}
Mar 20 11:53:31 crc kubenswrapper[4695]: I0320 11:53:31.811282 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q8drr" event={"ID":"f2909b97-5589-4705-bead-c164b10661e6","Type":"ContainerDied","Data":"d80f1c9cd009cb030fb9900693de9047e6fd68112f377dd654e47c75045fce47"}
Mar 20 11:53:31 crc kubenswrapper[4695]: I0320 11:53:31.811298 4695 scope.go:117] "RemoveContainer" containerID="6bacdca490bda379c7113fd17eaeee185cc44e1fba4bc4638eca3f93689b1494"
Mar 20 11:53:31 crc kubenswrapper[4695]: I0320
11:53:31.811415 4695 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q8drr"
Mar 20 11:53:31 crc kubenswrapper[4695]: I0320 11:53:31.846285 4695 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q8drr"]
Mar 20 11:53:31 crc kubenswrapper[4695]: I0320 11:53:31.847527 4695 scope.go:117] "RemoveContainer" containerID="237bbc6ceef617f9d5d2e4c5d1ea547c383e19a4d6c071d98277f116c539563f"
Mar 20 11:53:31 crc kubenswrapper[4695]: I0320 11:53:31.852818 4695 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q8drr"]
Mar 20 11:53:31 crc kubenswrapper[4695]: I0320 11:53:31.867761 4695 scope.go:117] "RemoveContainer" containerID="adfeaed1440eb892d8e2e66bd5b90f8a96a6d55fbefc7f75534f361d566082e2"
Mar 20 11:53:31 crc kubenswrapper[4695]: I0320 11:53:31.897531 4695 scope.go:117] "RemoveContainer" containerID="6bacdca490bda379c7113fd17eaeee185cc44e1fba4bc4638eca3f93689b1494"
Mar 20 11:53:31 crc kubenswrapper[4695]: E0320 11:53:31.898261 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6bacdca490bda379c7113fd17eaeee185cc44e1fba4bc4638eca3f93689b1494\": container with ID starting with 6bacdca490bda379c7113fd17eaeee185cc44e1fba4bc4638eca3f93689b1494 not found: ID does not exist" containerID="6bacdca490bda379c7113fd17eaeee185cc44e1fba4bc4638eca3f93689b1494"
Mar 20 11:53:31 crc kubenswrapper[4695]: I0320 11:53:31.898311 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6bacdca490bda379c7113fd17eaeee185cc44e1fba4bc4638eca3f93689b1494"} err="failed to get container status \"6bacdca490bda379c7113fd17eaeee185cc44e1fba4bc4638eca3f93689b1494\": rpc error: code = NotFound desc = could not find container \"6bacdca490bda379c7113fd17eaeee185cc44e1fba4bc4638eca3f93689b1494\": container with ID starting with 6bacdca490bda379c7113fd17eaeee185cc44e1fba4bc4638eca3f93689b1494 not found: ID does not exist"
Mar 20 11:53:31 crc kubenswrapper[4695]: I0320 11:53:31.898337 4695 scope.go:117] "RemoveContainer" containerID="237bbc6ceef617f9d5d2e4c5d1ea547c383e19a4d6c071d98277f116c539563f"
Mar 20 11:53:31 crc kubenswrapper[4695]: E0320 11:53:31.898709 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"237bbc6ceef617f9d5d2e4c5d1ea547c383e19a4d6c071d98277f116c539563f\": container with ID starting with 237bbc6ceef617f9d5d2e4c5d1ea547c383e19a4d6c071d98277f116c539563f not found: ID does not exist" containerID="237bbc6ceef617f9d5d2e4c5d1ea547c383e19a4d6c071d98277f116c539563f"
Mar 20 11:53:31 crc kubenswrapper[4695]: I0320 11:53:31.898740 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"237bbc6ceef617f9d5d2e4c5d1ea547c383e19a4d6c071d98277f116c539563f"} err="failed to get container status \"237bbc6ceef617f9d5d2e4c5d1ea547c383e19a4d6c071d98277f116c539563f\": rpc error: code = NotFound desc = could not find container \"237bbc6ceef617f9d5d2e4c5d1ea547c383e19a4d6c071d98277f116c539563f\": container with ID starting with 237bbc6ceef617f9d5d2e4c5d1ea547c383e19a4d6c071d98277f116c539563f not found: ID does not exist"
Mar 20 11:53:31 crc kubenswrapper[4695]: I0320 11:53:31.898759 4695 scope.go:117] "RemoveContainer" containerID="adfeaed1440eb892d8e2e66bd5b90f8a96a6d55fbefc7f75534f361d566082e2"
Mar 20 11:53:31 crc kubenswrapper[4695]: E0320 11:53:31.899208 4695 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adfeaed1440eb892d8e2e66bd5b90f8a96a6d55fbefc7f75534f361d566082e2\": container with ID starting with adfeaed1440eb892d8e2e66bd5b90f8a96a6d55fbefc7f75534f361d566082e2 not found: ID does not exist" containerID="adfeaed1440eb892d8e2e66bd5b90f8a96a6d55fbefc7f75534f361d566082e2"
Mar 20 11:53:31 crc kubenswrapper[4695]: I0320 11:53:31.899237 4695 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adfeaed1440eb892d8e2e66bd5b90f8a96a6d55fbefc7f75534f361d566082e2"} err="failed to get container status \"adfeaed1440eb892d8e2e66bd5b90f8a96a6d55fbefc7f75534f361d566082e2\": rpc error: code = NotFound desc = could not find container \"adfeaed1440eb892d8e2e66bd5b90f8a96a6d55fbefc7f75534f361d566082e2\": container with ID starting with adfeaed1440eb892d8e2e66bd5b90f8a96a6d55fbefc7f75534f361d566082e2 not found: ID does not exist"
Mar 20 11:53:32 crc kubenswrapper[4695]: I0320 11:53:32.898831 4695 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2909b97-5589-4705-bead-c164b10661e6" path="/var/lib/kubelet/pods/f2909b97-5589-4705-bead-c164b10661e6/volumes"
Mar 20 11:54:00 crc kubenswrapper[4695]: I0320 11:54:00.142616 4695 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566794-ltx9k"]
Mar 20 11:54:00 crc kubenswrapper[4695]: E0320 11:54:00.144736 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2909b97-5589-4705-bead-c164b10661e6" containerName="registry-server"
Mar 20 11:54:00 crc kubenswrapper[4695]: I0320 11:54:00.144756 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2909b97-5589-4705-bead-c164b10661e6" containerName="registry-server"
Mar 20 11:54:00 crc kubenswrapper[4695]: E0320 11:54:00.144928 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2909b97-5589-4705-bead-c164b10661e6" containerName="extract-utilities"
Mar 20 11:54:00 crc kubenswrapper[4695]: I0320 11:54:00.144941 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2909b97-5589-4705-bead-c164b10661e6" containerName="extract-utilities"
Mar 20 11:54:00 crc kubenswrapper[4695]: E0320 11:54:00.145025 4695 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2909b97-5589-4705-bead-c164b10661e6" containerName="extract-content"
Mar 20 11:54:00 crc kubenswrapper[4695]: I0320 11:54:00.145037 4695 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2909b97-5589-4705-bead-c164b10661e6" containerName="extract-content"
Mar 20 11:54:00 crc kubenswrapper[4695]: I0320 11:54:00.145440 4695 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2909b97-5589-4705-bead-c164b10661e6" containerName="registry-server"
Mar 20 11:54:00 crc kubenswrapper[4695]: I0320 11:54:00.146088 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566794-ltx9k"
Mar 20 11:54:00 crc kubenswrapper[4695]: I0320 11:54:00.150702 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 11:54:00 crc kubenswrapper[4695]: I0320 11:54:00.150864 4695 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-5kqds"
Mar 20 11:54:00 crc kubenswrapper[4695]: I0320 11:54:00.150993 4695 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 11:54:00 crc kubenswrapper[4695]: I0320 11:54:00.151496 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566794-ltx9k"]
Mar 20 11:54:00 crc kubenswrapper[4695]: I0320 11:54:00.231605 4695 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwk85\" (UniqueName: \"kubernetes.io/projected/40c29142-bcae-40de-95d5-37f679f64753-kube-api-access-rwk85\") pod \"auto-csr-approver-29566794-ltx9k\" (UID: \"40c29142-bcae-40de-95d5-37f679f64753\") " pod="openshift-infra/auto-csr-approver-29566794-ltx9k"
Mar 20 11:54:00 crc kubenswrapper[4695]: I0320 11:54:00.332781 4695 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwk85\" (UniqueName: \"kubernetes.io/projected/40c29142-bcae-40de-95d5-37f679f64753-kube-api-access-rwk85\")
pod \"auto-csr-approver-29566794-ltx9k\" (UID: \"40c29142-bcae-40de-95d5-37f679f64753\") " pod="openshift-infra/auto-csr-approver-29566794-ltx9k"
Mar 20 11:54:00 crc kubenswrapper[4695]: I0320 11:54:00.355153 4695 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwk85\" (UniqueName: \"kubernetes.io/projected/40c29142-bcae-40de-95d5-37f679f64753-kube-api-access-rwk85\") pod \"auto-csr-approver-29566794-ltx9k\" (UID: \"40c29142-bcae-40de-95d5-37f679f64753\") " pod="openshift-infra/auto-csr-approver-29566794-ltx9k"
Mar 20 11:54:00 crc kubenswrapper[4695]: I0320 11:54:00.468318 4695 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566794-ltx9k"
Mar 20 11:54:00 crc kubenswrapper[4695]: I0320 11:54:00.886947 4695 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566794-ltx9k"]
Mar 20 11:54:01 crc kubenswrapper[4695]: I0320 11:54:01.023658 4695 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566794-ltx9k" event={"ID":"40c29142-bcae-40de-95d5-37f679f64753","Type":"ContainerStarted","Data":"aef35ec62b724a49e633427df1e00fc2c97596907a3503b928c4f10acab8f512"}